Monday, May 20, 2013

Hardware

Computer Hardware Definition

Hardware is a comprehensive term for all of the physical parts of a computer, as distinguished from the data it contains or operates on and from the software that provides instructions for the hardware to accomplish tasks. The boundary between hardware and software is slightly blurry: firmware is software that is "built in" to the hardware, but such firmware is usually the province of computer programmers and engineers and not something computer users need to concern themselves with.
A typical personal computer (PC) contains the following parts in a desktop or tower case:
  • Motherboard, which holds the CPU, main memory, and other parts, and has slots for expansion cards
  • Power supply - a unit that holds a transformer, voltage regulator, and fan
  • Storage controllers, of IDE, SCSI, or another type, that control the hard disk, floppy disk, CD-ROM, and other drives; the controllers sit directly on the motherboard (on-board) or on expansion cards
  • Graphics controller that produces the output for the monitor
  • Hard disk, floppy disk, and other drives for mass storage
  • Interface controllers (parallel, serial, USB, FireWire) to connect the computer to external peripheral devices such as printers or scanners

Software

Computer Software Definition

Software is a generic term for organized collections of computer data and instructions, often broken into two major categories: system software, which provides the basic non-task-specific functions of the computer, and application software, which users employ to accomplish specific tasks.
System software is responsible for controlling, integrating, and managing the individual hardware components of a computer system so that other software and the users of the system see it as a functional unit without having to be concerned with the low-level details such as transferring data from memory to disk, or rendering text onto a display. Generally, system software consists of an operating system and some fundamental utilities such as disk formatters, file managers, display managers, text editors, user authentication (login) and management tools, and networking and device control software.
Application software, on the other hand, is used to accomplish specific tasks other than just running the computer system. Application software may consist of a single program, such as an image viewer; a small collection of programs (often called a software package) that work closely together to accomplish a task, such as a spreadsheet or text processing system; a larger collection (often called a software suite) of related but independent programs and packages that have a common user interface or shared data format, such as Microsoft Office, which consists of a closely integrated word processor, spreadsheet, database, and so on; or a software system, such as a database management system, which is a collection of fundamental programs that may provide some service to a variety of other independent applications.
Software is created with programming languages and related utilities, which may come in several of the above forms: single programs, like script interpreters; packages containing a compiler, linker, and other tools; and large suites (often called Integrated Development Environments) that include editors, debuggers, and other tools for multiple languages.
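To make the simplest of those forms concrete, here is a minimal sketch of the "script interpreter" case in Python (the file name and function are purely illustrative, not from any particular product):

    # hello.py - a tiny application-level script. The Python interpreter
    # is the single program that reads this file and executes it.
    def greet(name):
        # Build a greeting string for the given name.
        return "Hello, " + name + "!"

    print(greet("world"))

Typing "python hello.py" at a command prompt asks the interpreter to run the script; a compiled language would instead pass the source through a compiler and linker to produce a standalone executable.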

Basic Parts of a Computer



What is a Computer?

A computer is a device that accepts information (in the form of digitized data) and manipulates it for some result, based on a program: a sequence of instructions describing how the data is to be processed.

THE BASIC PARTS OF A COMPUTER


Disk - a piece of plastic that holds information for or from your computer.

CD - a round disc that holds information for or from your computer.

Hard Drive - a device with a lot of memory that saves your work; the computer case that holds it usually also carries a CD-ROM drive and a floppy drive.

Keyboard - a tool used for typing, like a typewriter.

Monitor - the screen on your computer that you look at while you work.

Mouse - a piece of plastic that has a ball on the bottom and two buttons on the top. When you click the mouse, you usually click the left button. It allows you to click on and choose things on your screen.

Printer - a machine that puts things from the computer onto paper.

Scanner - a piece of equipment that copies pictures so that you can use them in your computer projects.

Speakers - the parts of the computer that let you hear the sounds from the programs.

Modem - a part of the computer that connects to the phone lines so that you can go on the Internet.

Chip - a small piece inside the computer that helps your computer work. Chips have to be programmed by people or they won't work. There are many chips in a computer.

Motherboard - the main board of the computer that has many chips on it. The motherboard makes the computer work. It is also where the memory and the processor are found.

CPU (Central Processing Unit) - a chip that is the "brain" of your computer; it processes the information.

History of the Computer

What is a Computer?

In its most basic form, a computer is any device which aids humans in performing various kinds of computations or calculations. In that respect, the earliest computer was the abacus, used to perform basic arithmetic operations.
Every computer supports some form of input, processing, and output. This is less obvious on a primitive device such as the abacus, where input, output, and processing are simply the act of moving the pebbles into new positions, seeing the changed positions, and counting. Regardless, this is what computing is all about, in a nutshell. We input information, the computer processes it according to its basic logic or the program currently running, and outputs the results.
Modern computers do this electronically, which enables them to perform a vastly greater number of calculations or computations in less time. Despite the fact that we currently use computers to process images, sound, text, and other non-numerical forms of data, all of it depends on nothing more than basic numerical calculations. Graphics, sound, and the rest are merely abstractions of the numbers being crunched within the machine; in digital computers these are the ones and zeros, representing electrical on and off states, and endless combinations of those. In other words, every image, every sound, and every word has a corresponding binary code.
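For instance, here is a minimal Python sketch of that idea, showing input, processing, and output in miniature (the file and variable names are just illustrative):

    # binary_demo.py - show the binary code behind ordinary text.
    word = "Hi"                   # input
    for character in word:        # processing: map each character
        code = ord(character)     # to its numeric character code
        print(character, format(code, "08b"))  # output, e.g. "H 01001000"

Running it prints the eight-bit pattern for each letter, which is all the machine actually stores.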
While the abacus may technically have been the first computer, most people today associate the word "computer" with electronic computers, which were invented in the last century and have evolved into the modern computers we know today.
(Image: ENIAC)

First Generation Computers (1940s – 1950s)

The first electronic computers used vacuum tubes, and they were huge and complex. The first general-purpose electronic computer was the ENIAC (Electronic Numerical Integrator And Computer). It was digital, although it didn't operate with binary code, and was reprogrammable to solve a complete range of computing problems. It was programmed using plugboards and switches, supporting input from an IBM card reader and output to an IBM card punch. It took up 167 square meters, weighed 27 tons, and consumed 150 kilowatts of power. It used thousands of vacuum tubes, crystal diodes, relays, resistors, and capacitors.
The first non-general-purpose electronic computer was the ABC (Atanasoff–Berry Computer), and other computers of this era included the German Z3, the ten British Colossus computers, the LEO, the Harvard Mark I, and UNIVAC.
(Image: IBM 1401)

Second Generation Computers (1955 – 1960)

The second generation of computers came about thanks to the invention of the transistor, which then started replacing vacuum tubes in computer design. Transistor computers consumed far less power, produced far less heat, and were much smaller compared to the first generation, albeit still big by today’s standards.
The first transistor computer was created at the University of Manchester in 1953. The most popular transistor computer was the IBM 1401. IBM also created the first disk drive, the IBM 350 RAMAC, in 1956.

Third Generation Computers (1960s)

(Image: IBM System/360)
The invention of the integrated circuit (IC), also known as the microchip, paved the way for computers as we know them today. Making circuits out of single pieces of silicon, which is a semiconductor, allowed them to be much smaller and more practical to produce. This also started the ongoing process of integrating an ever larger number of transistors onto a single microchip. During the sixties microchips started making their way into computers, but the process was gradual, and the second generation of computers still held on.
Minicomputers appeared first; the earliest of these were still based on non-microchip transistors, and later versions were hybrids, based on both transistors and microchips, such as IBM's System/360. They were much smaller and cheaper than the first and second generations of computers, also known as mainframes. Minicomputers can be seen as a bridge between mainframes and microcomputers, which came later as the proliferation of microchips in computers grew.

Fourth Generation Computers (1971 – present)

The first microchip-based central processing units consisted of multiple microchips for the different CPU components. The drive for ever greater integration and miniaturization led towards single-chip CPUs, where all of the necessary CPU components were put onto a single microchip, called a microprocessor. The first single-chip CPU, or microprocessor, was the Intel 4004.
The advent of the microprocessor spawned the evolution of microcomputers, the kind that would eventually become the personal computers we are familiar with today.

First Generation of Microcomputers (1971 – 1976)

(Image: Altair 8800)
The first microcomputers were a weird bunch. They often came in kits, and many were essentially just boxes with lights and switches, usable only by engineers and hobbyists who could understand binary code. Some, however, did come with a keyboard and/or a monitor, bearing somewhat more resemblance to modern computers.
It is arguable which of the early microcomputers could be called the first. The CTC Datapoint 2200 is one candidate, although it didn't actually contain a microprocessor (being based on a multi-chip CPU design instead) and wasn't meant to be a standalone computer, but merely a terminal for mainframes. The reason some might consider it the first microcomputer is that it could be used as a de facto standalone computer, it was small enough, and its multi-chip CPU architecture actually became the basis for the x86 architecture later used in the IBM PC and its descendants. Plus, it even came with a keyboard and a monitor, an exception in those days.
However, if we are looking for the first microcomputer that came with a proper microprocessor, was meant to be a standalone computer, and didn't come as a kit, then it would be the Micral N, which used the Intel 8008 microprocessor.
Popular early microcomputers which did come in kits include the MOS Technology KIM-1, the Altair 8800, and the Apple I. The Altair 8800 in particular spawned a large following among hobbyists and is considered the spark that started the microcomputer revolution, as these hobbyists went on to found companies centered on personal computing, such as Microsoft and Apple.

Second Generation Microcomputers (1977 – present)

(Image: Commodore PET 2001, by Tomislav Medak, licensed under CC-BY-SA)
As microcomputers continued to evolve they became easier to operate, making them accessible to a larger audience. They typically came with a keyboard and a monitor, or could be easily connected to a TV, and they supported visual representation of text and numbers on the screen.
In other words, lights and switches were replaced by screens and keyboards, and the necessity to understand binary code was diminished as computers increasingly came with programs that could be used by issuing more easily understandable commands. Famous early examples of such computers include the Commodore PET, the Apple II, and, in the 80s, the IBM PC.
The nature of the underlying electronic components didn't change between these computers and the modern computers we know today, but what did change was the number of circuits that could be put onto a single microchip. Intel's co-founder Gordon Moore predicted the doubling of the number of transistors on a single chip every two years, which became known as "Moore's Law", and this trend has roughly held for over 30 years thanks to advancing manufacturing processes and microprocessor designs.
The consequence was a predictable exponential increase in processing power that could be put into a smaller package, which had a direct effect on the possible form factors as well as the applications of modern computers, and this is what most of the paradigm-shifting innovations described below were about.
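As a rough worked example of that doubling (a minimal Python sketch; the starting count is the commonly cited transistor count of the Intel 4004, and the rest is round illustrative arithmetic):

    # moores_law.py - illustrate doubling every two years.
    transistors = 2300    # approximate transistor count of the Intel 4004 (1971)
    year = 1971
    while year < 2001:
        year += 2         # one doubling roughly every two years
        transistors *= 2
    print(year, transistors)  # 15 doublings: 2300 * 2**15 = 75,366,400

Fifteen doublings over thirty years turn a few thousand transistors into tens of millions, which matches the order of magnitude of real chips around the year 2000.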

Graphical User Interface (GUI)

(Image: Macintosh 128k, by All About Apple museum, licensed under CC-BY-SA-2.5-it)
Possibly the most significant of those shifts was the invention of the graphical user interface, and the mouse as a way of controlling it. Doug Engelbart and his team at the Stanford Research Institute developed the first mouse and a graphical user interface, demonstrated in 1968. They were still a few years ahead of the personal computer revolution later sparked by the Altair 8800, so their idea didn't take hold.
Instead it was picked up and improved upon by researchers at the Xerox PARC research center, who in 1973 developed the Xerox Alto, the first computer with a mouse-driven GUI. It never became a commercial product, however, as Xerox management wasn't ready to dive into the computer market and didn't see the potential of what they had early enough.
It took Steve Jobs negotiating a stock deal with Xerox, in exchange for a tour of their research center, to finally bring the user-friendly graphical user interface, as well as the mouse, to the masses. Steve Jobs was shown what the Xerox PARC team had developed and directed Apple to improve upon it. In 1984 Apple introduced the Macintosh, the first mass-market computer with a graphical user interface and a mouse.
Microsoft later caught on and produced Windows, and the historic competition between the two companies started, resulting in improvements to the graphical user interface to this day.
Meanwhile IBM was dominating the PC market with their IBM PC, and Microsoft was riding on their coattails by being the one to produce and sell the operating system for the IBM PC, known as "DOS" or "Disk Operating System". The Macintosh, with its graphical user interface, was meant to dislodge IBM's dominance, but Microsoft made this more difficult with their PC-compatible Windows operating system with its own GUI.

Portable Computers

(Image: PowerBook 150, by Dana Sibera, licensed under CC-BY-SA)
As it turned out, the idea of a laptop-like portable computer existed even before it was possible to create one. It was developed at Xerox PARC by Alan Kay, who called it the Dynabook and intended it for children. The first portable computer actually created was the Xerox NoteTaker, but only 10 were produced.
The first commercialized portable computer was the Osborne 1, in 1981, with a small 5″ CRT monitor and a keyboard that sat inside the lid when closed. It ran CP/M (the OS that DOS was closely modeled on). Later portable computers included the Bondwell 2, released in 1985 and also running CP/M, which was among the first with a hinge-mounted LCD display. The Compaq Portable was the first IBM PC compatible portable computer; it ran MS-DOS but was less portable than the Bondwell 2. Other examples of early portable computers included the Epson HX-20, the GRiD Compass, the Dulmont Magnum, the Kyotronic 85, the Commodore SX-64, the IBM PC Convertible, and the Toshiba T1100, T1000, and T1200.
The first portable computers which resembled modern laptops in features were Apple's PowerBooks, which first introduced a built-in trackball, and later a trackpad and optional color LCD screens. IBM's ThinkPad was largely inspired by the PowerBook's design, and the evolution of the two led to laptops and notebook computers as we know them. PowerBooks were eventually replaced by the modern MacBook Pro.
Of course, much of the evolution of portable computers was enabled by the evolution of microprocessors, LCD displays, battery technology and so on. This evolution ultimately allowed computers even smaller and more portable than laptops, such as PDAs, tablets, and smartphones.
