Thursday, April 30, 2009


The Basics -- What is a "Megabyte"?




SIMPLE DEFINITION: A byte (sometimes expanded as "BinarY TablE") is the common unit of computer storage, from desktop computers to mainframes. It is made up of eight binary digits (bits).


In most computer systems, a byte is a unit of data that is eight binary digits long. A byte is the unit most computers use to represent a character such as a letter, number, or typographic symbol (for example, "g", "5", or "?"). A byte can also hold a string of bits that needs to be used in some larger unit for application purposes (for example, the stream of bits that makes up a visual image for a program that displays images, or the string of bits that makes up the machine code of a computer program).
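
If you'd like to see this for yourself, here is a tiny Python sketch (the names are just for illustration) that takes the letter "g" and shows the single byte, and the eight bits inside it, that the computer stores:

    # A single character such as "g" is stored as one byte (8 bits).
    letter = "g"

    byte_value = letter.encode("ascii")   # b'g' -- one byte
    number = byte_value[0]                # 103  -- that byte as a number
    bits = format(number, "08b")          # '01100111' -- its eight bits

    print(byte_value, number, bits)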


In some computer systems, four bytes constitute a word, a unit that a computer processor can be designed to handle efficiently as it reads and processes each instruction. Some computer processors can handle two-byte or single-byte instructions.
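
To make the idea of a multi-byte "word" concrete, this small Python sketch (again, just an illustration) packs the same number into a four-byte word, a two-byte unit, and a single byte, then checks how long each one is:

    import struct

    value = 65  # the code for the letter "A"

    four_byte_word = struct.pack(">I", value)   # b'\x00\x00\x00A'
    two_bytes      = struct.pack(">H", value)   # b'\x00A'
    one_byte       = struct.pack(">B", value)   # b'A'

    print(len(four_byte_word), len(two_bytes), len(one_byte))   # 4 2 1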


A byte is abbreviated with an uppercase "B" (a bit is abbreviated with a lowercase "b"). Computer storage is usually measured in byte multiples. For example, an 820 MB hard drive holds a nominal 820 million bytes, or 820 megabytes, of data. Byte multiples are traditionally based on powers of 2, even though they are commonly expressed as rounded-off decimal numbers. For example, one megabyte ("one million bytes") is actually 1,048,576 (decimal) bytes.


(Confusingly, however, some hard disk manufacturers and dictionary sources state that bytes for computer storage should be calculated as powers of 10 so that a megabyte really would be one million decimal bytes.)
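
The difference between the two conventions is easy to check. The short Python sketch below compares the "power of 2" megabyte with the "power of 10" megabyte, and shows why an "820 MB" drive looks a little smaller once it is measured the binary way:

    binary_megabyte  = 2 ** 20    # 1,048,576 bytes (1024 x 1024)
    decimal_megabyte = 10 ** 6    # 1,000,000 bytes

    print(binary_megabyte - decimal_megabyte)          # 48576 bytes of difference

    # An "820 MB" drive labeled in decimal megabytes holds
    # fewer binary megabytes than the label suggests.
    print(820 * decimal_megabyte / binary_megabyte)    # about 781.9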


PREFIXES FOR BIT AND BYTE MULTIPLES

Value      Symbol     Equals ... Bytes (decimal)
Kilobyte   kB or KB   1,000
Megabyte   MB         1,000,000
Gigabyte   GB         1,000,000,000
Terabyte   TB         1,000,000,000,000
Petabyte   PB         1,000,000,000,000,000
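
Assuming the decimal (power of 10) values in the table, a small helper such as the hypothetical format_bytes function below shows how a raw byte count maps onto these prefixes:

    def format_bytes(count):
        """Express a raw byte count using the decimal prefixes in the table above."""
        units = [("PB", 10 ** 15), ("TB", 10 ** 12), ("GB", 10 ** 9),
                 ("MB", 10 ** 6), ("kB", 10 ** 3)]
        for symbol, size in units:
            if count >= size:
                return f"{count / size:.1f} {symbol}"
        return f"{count} B"

    print(format_bytes(820_000_000))          # 820.0 MB
    print(format_bytes(1_500_000_000_000))    # 1.5 TB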


Some language scripts require two bytes to represent a character; the encodings used for them are called double-byte character sets (DBCS).
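
As a quick illustration (using UTF-16 purely as an example of a two-byte encoding), the sketch below shows a Latin letter fitting in a single byte while a Japanese character needs two:

    latin = "g".encode("ascii")        # 1 byte
    kana  = "あ".encode("utf-16-be")   # 2 bytes
    print(len(latin), len(kana))       # 1 2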

Computer Speed



MHz and GHz are used to measure the speed of the CPU. For example, a 1.6 GHz computer processes data internally (calculates, compares, etc.) twice as fast as an 800 MHz machine. However, the doubled clock speed of the CPU does not mean twice as much finished work gets done in the same time frame. Internal cache design, bus speed, disk speed, network speed and software design all contribute to the computer's overall processing speed and performance (overall throughput).
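
The raw clock numbers themselves are simple to relate. The small Python sketch below compares the two machines mentioned above; as the paragraph explains, though, doubling the raw clock pulses does not double the finished work:

    slow_cpu_hz = 800_000_000       # 800 MHz
    fast_cpu_hz = 1_600_000_000     # 1.6 GHz

    print(fast_cpu_hz / slow_cpu_hz)               # 2.0 -- twice the raw clock pulses

    # Time for one clock cycle, in nanoseconds.
    print(1e9 / slow_cpu_hz, 1e9 / fast_cpu_hz)    # 1.25 and 0.625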


Users are often dismayed to find that they only obtain incremental improvements after purchasing a computer rated much faster than their old one. In addition, newer versions of software are often less efficient than previous ones. A faster computer is often required just to maintain the same performance level as the old software.


MHz and GHz Are the Heartbeat


When referencing CPU speed, the megahertz and gigahertz ratings are really the heartbeat of the computer, providing the raw, steady pulses that energize its circuits. If you know German, this is easy to remember: the word "Herz," pronounced "hayrts," means heart. The resemblance is a coincidence; the unit is named after Heinrich Hertz, who demonstrated the existence of electromagnetic waves in the 1880s.



Monday, April 27, 2009

Apple Personal Computer


The 1977 Apple II, shown here with twin floppy disk drives and a monitor. The Apple II featured an integrated keyboard, sound, a plastic case, and eight internal expansion slots.


Apple

Steve Wozniak and Steve Jobs, high school friends from the Cupertino, California area, produced their first computer, the single-board Apple I, in a garage workshop in 1976. After selling 200 or so of them, Jobs attracted the attention of investors, co-founded Apple Computer with Wozniak, and introduced the Apple II computer in 1977.

It was the engineering skill of Wozniak (known affectionately as "Woz"), the marketing ability of Jobs, and the hard work of many of the early employees that contributed to Apple's early success.

Apple was the first company to mass-market a graphical user interface, in its Macintosh computer introduced in 1984, a product that redefined personal computing.

Apple Computer took the world by storm in 1977 with the Apple II, the first widely successful personal computer. It was also the first successful computer built around a keyboard and monitor, a form that has come to be synonymous with "computer" for many people. The II sold millions of units, making Apple a billion-dollar company.