Thursday, May 7, 2015

Mouse

Mouse (computing)



In computing, a mouse is a pointing device that detects two-dimensional motion relative to a surface. This motion is typically translated into the motion of a pointer on a display, which allows for fine control of a graphical user interface.

Physically, a mouse consists of an object held in one's hand, with one or more buttons. Mice often also feature other elements, such as touch surfaces and "wheels", which enable additional control and dimensional input.
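
As a loose illustration of how such relative motion becomes pointer movement, the sketch below accumulates hypothetical (dx, dy) motion reports and clamps the pointer to the display. The display size, function name, and example deltas are illustrative assumptions, not taken from any particular system or driver.

    # Minimal sketch: turning relative mouse deltas into an on-screen pointer position.
    # The 1920x1080 display size and all values below are illustrative assumptions.

    DISPLAY_WIDTH, DISPLAY_HEIGHT = 1920, 1080

    def move_pointer(x, y, dx, dy):
        # Apply one relative motion report and keep the pointer inside the display.
        x = min(max(x + dx, 0), DISPLAY_WIDTH - 1)
        y = min(max(y + dy, 0), DISPLAY_HEIGHT - 1)
        return x, y

    # Example: three motion reports starting from the centre of the screen.
    pos = (960, 540)
    for delta in [(12, -4), (30, 18), (-500, 0)]:
        pos = move_pointer(pos[0], pos[1], delta[0], delta[1])
        print(pos)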

An early precursor of the mouse, the trackball, was invented in 1946 by Ralph Benjamin as part of a British post-war fire-control radar plotting system. The device was patented in 1947, but only a prototype using a metal ball rolling on two rubber-coated wheels was ever built, and the device was kept as a military secret.
Another early trackball was built by British electrical engineer Kenyon Taylor in collaboration with Tom Cranston and Fred Longstaff. Taylor was part of the original Ferranti Canada, working on the Royal Canadian Navy's DATAR (Digital Automated Tracking and Resolving) system in 1952.

DATAR was similar in concept to Benjamin's display. The trackball used four disks to pick up motion, two each for the X and Y directions. Several rollers provided mechanical support. When the ball was rolled, the pickup discs spun and contacts on their outer rim made periodic contact with wires, producing pulses of output with each movement of the ball. By counting the pulses, the physical movement of the ball could be determined. A digital computer calculated the tracks, and sent the resulting data to other ships in a task force using pulse-code modulation radio signals. This trackball used a standard Canadian five-pin bowling ball. It was not patented, as it was a secret military project as well.
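
As a rough sketch of the pulse-counting idea described above, the snippet below converts signed pulse counts from the two axes into a displacement. The calibration constant and sign convention are assumptions for illustration only, not figures from the DATAR project.

    # Sketch of per-axis pulse counting, as described for the DATAR trackball.
    # PULSES_PER_MM is a made-up calibration constant for illustration only.

    PULSES_PER_MM = 4.0

    def displacement_mm(x_pulses, y_pulses):
        # Convert signed pulse counts from the X and Y pickup disks into millimetres.
        return x_pulses / PULSES_PER_MM, y_pulses / PULSES_PER_MM

    print(displacement_mm(20, -8))   # (5.0, -2.0); the sign convention is assumed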

Early mouse patents. From left to right: Opposing track wheels by Engelbart, Nov. 1970, U.S. Patent 3,541,541. Ball and wheel by Rider, Sept. 1974, U.S. Patent 3,835,464. Ball and two rollers with spring by Opocensky, Oct. 1976, U.S. Patent 3,987,685
Independently, Douglas Engelbart at the Stanford Research Institute (now SRI International) invented his first mouse prototype in the 1960s with the assistance of his lead engineer Bill English. They christened the device the mouse as early models had a cord attached to the rear part of the device looking like a tail and generally resembling the common mouse. Engelbart never received any royalties for it, as his employer SRI held the patent, which ran out before it became widely used in personal computers. The invention of the mouse was just a small part of Engelbart's much larger project, aimed at augmenting human intellect via the Augmentation Research Center.

Inventor Douglas Engelbart's computer mouse, showing the wheels that make contact with the working surface.
Several other experimental pointing devices developed for Engelbart's oN-Line System (NLS) exploited different body movements – for example, head-mounted devices attached to the chin or nose – but ultimately the mouse won out because of its speed and convenience. The first mouse, a bulky device (pictured), used two wheels perpendicular to each other: the rotation of each wheel translated into motion along one axis. At the time of the "Mother of All Demos", Engelbart's group had been using their second-generation, three-button mouse for about a year.

Monday, May 4, 2015

Camera

Camera


A camera is an optical instrument for recording images, which may be stored locally, transmitted to another location, or both. The images may be individual still photographs or sequences of images constituting videos or movies. The word camera comes from camera obscura, which means "dark chamber" and is the Latin name of the original device for projecting an image of external reality onto a flat surface. The modern photographic camera evolved from the camera obscura. The functioning of the camera is very similar to the functioning of the human eye.

Functional description

Basic elements of a modern still camera
A camera may work with the light of the visible spectrum or with other portions of the electromagnetic spectrum. A still camera is an optical device which creates a single image of an object or scene and records it on an electronic sensor or photographic film. All cameras use the same basic design: light enters an enclosed box through a converging lens and an image is recorded on a light-sensitive medium. A shutter mechanism controls the length of time that light can enter the camera. Most photographic cameras have functions that allow a person to view the scene to be recorded, to bring a desired part of the scene into focus, and to control the exposure so that it is not too bright or too dim. A display, often a liquid crystal display (LCD), permits the user to view the scene to be recorded and settings such as ISO speed, exposure, and shutter speed.
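
Since the paragraph above mentions shutter speed, ISO, and exposure control, a worked example may help. Photographers often summarize an exposure with the textbook exposure-value formula EV = log2(N^2 / t), where N is the f-number and t the shutter time in seconds. The short sketch below applies that formula; it is a general illustration, not a description of any particular camera's firmware.

    import math

    def exposure_value(f_number, shutter_seconds):
        # Exposure value at ISO 100: EV = log2(N^2 / t)
        return math.log2(f_number ** 2 / shutter_seconds)

    # f/8 at 1/125 s gives roughly EV 13, a typical overcast-daylight exposure.
    print(round(exposure_value(8, 1 / 125), 1))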

A movie camera or a video camera operates similarly to a still camera, except it records a series of static images in rapid succession, commonly at a rate of 24 frames per second. When the images are combined and displayed in order, the illusion of motion is achieved.

Sunday, May 3, 2015

Laptop or Notebook

Laptop or Notebook



A laptop or a notebook is a portable personal computer with a clamshell form factor, suitable for mobile use. Laptops and notebooks were once distinct categories, but the distinction has gradually disappeared. Laptops are commonly used in a variety of settings, including at work, in education, and for personal multimedia.

A laptop combines the components and inputs of a desktop computer, including display, speakers, keyboard and pointing device (such as a touchpad, also known as a trackpad), into a single device. Most modern-day laptops also have an integrated webcam and a microphone. A laptop can be powered either from a rechargeable battery or by mains electricity via an AC adapter. Laptops are a diverse category of devices, and more specific terms, such as rugged notebook or convertible, refer to specialist types of laptops optimized for particular uses. Hardware specifications change significantly between different types, makes and models of laptops.

Portable computers, which later developed into modern laptops, were originally considered to be a small niche market, mostly for specialized field applications such as the military, accountancy, and sales. As portable computers became more like modern laptops – smaller, lighter, cheaper, and more powerful – they came to be very widely used for a variety of purposes.

Saturday, May 2, 2015

Hard Disk Drive


A hard disk drive (HDD), hard disk, hard drive or fixed disk is a data storage device used for storing and retrieving digital information using one or more rigid ("hard") rapidly rotating disks (platters) coated with magnetic material. The platters are paired with magnetic heads arranged on a moving actuator arm, which read and write data to the platter surfaces. Data is accessed in a random-access manner, meaning that individual blocks of data can be stored or retrieved in any order rather than sequentially. An HDD retains its data even when powered off.

Introduced by IBM in 1956, HDDs became the dominant secondary storage device for general-purpose computers by the early 1960s. Continuously improved, HDDs have maintained this position into the modern era of servers and personal computers. More than 200 companies have produced HDD units, though most current units are manufactured by Seagate, Toshiba and Western Digital. Worldwide disk storage revenues were US $32 billion in 2013, down 3% from 2012.

The primary characteristics of an HDD are its capacity and performance. Capacity is specified in unit prefixes corresponding to powers of 1000: a 1-terabyte (TB) drive has a capacity of 1,000 gigabytes (GB; where 1 gigabyte = 1 billion bytes). Typically, some of an HDD's capacity is unavailable to the user because it is used by the file system and the computer operating system, and possibly inbuilt redundancy for error correction and recovery. Performance is specified by the time required to move the heads to a track or cylinder (average access time) plus the time it takes for the desired sector to move under the head (average latency, which is a function of the physical rotational speed in revolutions per minute), and finally the speed at which the data is transmitted (data rate).
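
Two figures in the paragraph above can be checked with a few lines of arithmetic: the gap between a drive's decimal (power-of-1000) capacity and the binary units an operating system may report, and the average rotational latency implied by a spindle speed (on average, the desired sector is half a revolution away). The 7200 rpm value below is just a common example, not a figure from the text.

    def decimal_tb_to_tib(capacity_tb):
        # A "1 TB" drive holds 10**12 bytes, which is fewer binary tebibytes (2**40 bytes each).
        return capacity_tb * 10 ** 12 / 2 ** 40

    def average_rotational_latency_ms(rpm):
        # On average the desired sector is half a revolution away from the head.
        return 0.5 * 60_000 / rpm

    print(round(decimal_tb_to_tib(1), 3))                 # ~0.909 TiB reported for a 1 TB drive
    print(round(average_rotational_latency_ms(7200), 2))  # ~4.17 ms at 7200 rpm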

The two most common form factors for modern HDDs are 3.5-inch, for desktop computers, and 2.5-inch, primarily for laptops. HDDs are connected to systems by standard interface cables such as SATA (Serial ATA), USB or SAS (Serial attached SCSI) cables.

As of 2015, the primary competing technology for secondary storage is flash memory in the form of solid-state drives (SSDs), but HDDs remain the dominant medium for secondary storage due to advantages in price per unit of storage and recording capacity. However, SSDs are replacing HDDs where speed, power consumption and durability are more important considerations.

Tuesday, April 28, 2015

What is Android?

What is Android?
Android is a mobile operating system (OS) based on the Linux kernel and currently developed by Google. With a user interface based on direct manipulation, Android is designed primarily for touchscreen mobile devices such as smartphones and tablet computers, with specialized user interfaces for televisions (Android TV), cars (Android Auto), and wrist watches (Android Wear). The OS uses touch inputs that loosely correspond to real-world actions, like swiping, tapping, pinching, and reverse pinching to manipulate on-screen objects, and a virtual keyboard. Despite being primarily designed for touchscreen input, it also has been used in game consoles, digital cameras, regular PCs (e.g. the HP Slate 21) and other electronics.

As of July 2013 the Google Play store has had over one million Android applications ("apps") published, and over 50 billion applications downloaded. A developer survey conducted in April–May 2013 found that 71% of mobile developers develop for Android. At Google I/O 2014, the company revealed that there were over one billion active monthly Android users, up from 538 million in June 2013. As of 2015, Android has the largest installed base of all general-purpose operating systems.

Android's source code is released by Google under open source licenses, although most Android devices ultimately ship with a combination of open source and proprietary software, including proprietary software developed and licensed by Google. Initially developed by Android, Inc., which Google backed financially and later bought in 2005, Android was unveiled in 2007 along with the founding of the Open Handset Alliance, a consortium of hardware, software, and telecommunication companies devoted to advancing open standards for mobile devices.

Android is popular with technology companies which require a ready-made, low-cost and customizable operating system for high-tech devices. Android's open nature has encouraged a large community of developers and enthusiasts to use the open-source code as a foundation for community-driven projects, which add new features for advanced users or bring Android to devices which were officially released running other operating systems. The operating system's success has made it a target for patent litigation as part of the so-called "smartphone wars" between technology companies.

Saturday, April 25, 2015

What is a Smartphone?




A smartphone (or smart phone) is a mobile phone with an advanced operating system. Smartphones typically combine the features of a phone with those of other popular mobile devices, such as a personal digital assistant, media player and GPS navigation unit. Most have a touchscreen interface, can run third-party apps, and are camera phones. Later smartphones added broadband internet web browsing, Wi-Fi, motion sensors and mobile payment mechanisms.

In February 2014, 93% of mobile developers were targeting smartphones first for app development. In 2014, sales of smartphones worldwide topped 1.2 billion, up 28% from 2013.

Early years

The first caller identification receiver
Devices that combined telephony and computing were first conceptualized by Theodore G. Paraskevakos in 1971 and patented in 1974, and were offered for sale beginning in 1993. He was the first to introduce the concepts of intelligence, data processing and visual display screens into telephones, which gave rise to the "smartphone." In 1971, Paraskevakos, working with Boeing in Huntsville, Alabama, demonstrated a transmitter and receiver that provided additional ways to communicate with remote equipment; however, it did not yet have the general-purpose PDA applications in a wireless device typical of smartphones. The units were installed at Peoples' Telephone Company in Leesburg, Alabama and were demonstrated to several telephone companies. The original and historic working models are still in the possession of Paraskevakos.

Forerunners

IBM Simon and charging base (1993)
The first mobile phone to incorporate PDA features was an IBM prototype developed in 1992 and demonstrated that year at the COMDEX computer industry trade show. A refined version of the product was marketed to consumers in 1994 by BellSouth under the name Simon Personal Communicator. The Simon was the first cellular device that could properly be referred to as a "smartphone", although it was not called one in 1994. In addition to making and receiving cellular phone calls, the Simon could send and receive faxes and e-mails, and it included several other apps, such as an address book, calendar, appointment scheduler, calculator, world time clock, and note pad, all accessed through its touch-screen display.

The term "smart phone" appeared in print in 1995, for describing AT&T's "PhoneWriter(TM) Communicator" as a "smart phone".

iPhone & Android

In 2007, Apple Inc. introduced the iPhone, one of the first mobile phones to use a multi-touch interface. The iPhone was notable for its use of a large touchscreen for direct finger input as its main means of interaction, instead of a stylus, keyboard, or keypad, as was typical for smartphones at the time. 2008 saw the release of the first phone to use Android, the HTC Dream (also known as the T-Mobile G1). Android is an open-source platform founded by Andy Rubin and now owned by Google. Although Android's adoption was relatively slow at first, it started to gain widespread popularity in 2010 and now dominates the market.

These new platforms led to the decline of earlier ones. Microsoft, for instance, started a new OS from scratch in the form of Windows Phone; Nokia abandoned Symbian and partnered with Microsoft to use Windows Phone on its smartphones, and Windows Phone became the third-most-popular OS. Palm was bought by Hewlett-Packard, and its webOS was open-sourced as Open webOS and later sold to LG Electronics. Research In Motion also built a new system from scratch, BlackBerry 10.

The capacitive touchscreen also had a knock-on effect on smartphone form factors. Before 2007 it was common for devices to have a physical numeric keypad or physical QWERTY keyboard in either a candybar or sliding form factor. However, by 2010, there were no top-tier smartphones with physical numeric keypads.

The future

In 2013, the Fairphone company launched its first "socially ethical" smartphone at the London Design Festival to address concerns regarding the sourcing of materials in the manufacturing.
In late 2013, QSAlpha commenced production of a smartphone designed entirely around security, encryption and identity protection.
In December 2013, the world's first curved-OLED technology smartphones were introduced to the retail market with the sale of the Samsung Galaxy Round and LG G Flex models. Samsung phones with more bends and folds in the screens are expected this year.
Foldable OLED smartphones could be as much as a decade away because of the cost of producing them. There is a relatively high failure rate when producing these screens. As little as a speck of dust can ruin a screen during production. Creating a battery that can be folded is another hurdle.
A clear, thin layer of crystal glass can be added to small screens, such as those of watches and smartphones, making them solar powered. Smartphones could gain 15% more battery life during a typical day. The first smartphones using this technology should arrive in 2015. Such a screen can also receive Li-Fi signals, as can the smartphone camera. The cost of these screens is between $2 and $3 per smartphone, much cheaper than most new technology.
Near future smartphones might not have a traditional battery as their sole source of power. Instead, they may pull energy from radio, television, cellular or Wi-Fi signals.
In early 2014, smartphones began to use Quad HD (2K) 2560x1440 resolution on 5.5-inch screens, with up to 534 ppi on devices such as the LG G3, a significant improvement over Apple's Retina display (the calculation behind that figure is sketched just after these items). Quad HD is used in advanced televisions and computer monitors, but at 110 ppi or less on such larger displays.
As of 2014, Wi-Fi networks are widely used by smartphones. As Wi-Fi becomes more prevalent and easier to connect to, Wi-Fi phone service is expected to take off.
Since 2013, water- and dust-proofing have made their way into mainstream high-end smartphones rather than only specialist models, beginning with the Sony Xperia Z and continuing through the Sony Xperia Z3, and from other manufacturers with the Samsung Galaxy S5.
Focusing remains a challenge for smartphone cameras, but the LG G3 Beat with Laser Focus offers 8 focus points: touching an object shown on the LCD focuses on it, while the rest of the frame is rendered with 'bokeh'.
Some smartphones can be categorized as high-end point-and-shoot cameras, with large sensors up to 1 inch, 20-megapixel resolution, and 4K video. Some can store their pictures in proprietary raw image formats, and Android 5.0 Lollipop adds support for saving raw images in an open format.
Modular smartphones are projected, in which users can remove and replace parts.
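
As promised above, the 534 ppi figure quoted for Quad HD on a 5.5-inch panel follows directly from the resolution and the diagonal size: ppi = sqrt(width^2 + height^2) / diagonal. A quick check in Python (a standalone sketch, not tied to any particular device):

    import math

    def pixels_per_inch(width_px, height_px, diagonal_inches):
        # Pixel density from resolution and diagonal screen size.
        return math.hypot(width_px, height_px) / diagonal_inches

    print(round(pixels_per_inch(2560, 1440, 5.5)))   # 534, matching the Quad HD figure above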

Thursday, April 23, 2015

Air conditioning

Air conditioning




Air conditioning (often referred to as A/C, AC or aircon) is the process of altering the properties of air (primarily temperature and humidity) to more comfortable conditions, typically with the aim of distributing the conditioned air to an occupied space to improve thermal comfort and indoor air quality.

In common use, an air conditioner is a device that lowers the air temperature. The cooling is typically achieved through a refrigeration cycle, but sometimes evaporation or free cooling is used. Air conditioning systems can also be made based on desiccants.

In the most general sense, air conditioning can refer to any form of technology that modifies the condition of air (heating, cooling, (de-)humidification, cleaning, ventilation, or air movement). However, in construction, such a complete system of heating, ventilation, and air conditioning is referred to as HVAC (as opposed to AC).

The basic concept behind air conditioning is said to have been applied in ancient Egypt, where reeds were hung in windows and were moistened with trickling water. The evaporation of water cooled the air blowing through the window. This process also made the air more humid, which can be beneficial in a dry desert climate. In Ancient Rome, water from aqueducts was circulated through the walls of certain houses to cool them. Other techniques in medieval Persia involved the use of cisterns and wind towers to cool buildings during the hot season.

Modern air conditioning emerged from advances in chemistry during the 19th century, and the first large-scale electrical air conditioning was invented and used in 1902 by American inventor Willis Carrier. The introduction of residential air conditioning in the 1920s helped enable the great migration to the Sun Belt in the United States.

Graphics processing unit

Graphics processing unit




A graphics processing unit (GPU), also occasionally called a visual processing unit (VPU), is a specialized electronic circuit designed to rapidly manipulate and alter memory to accelerate the creation of images in a frame buffer intended for output to a display. GPUs are used in embedded systems, mobile phones, personal computers, workstations, and game consoles. Modern GPUs are very efficient at manipulating computer graphics and image processing, and their highly parallel structure makes them more effective than general-purpose CPUs for algorithms where processing of large blocks of data is done in parallel. In a personal computer, a GPU can be present on a video card, or it can be embedded on the motherboard or, in certain CPUs, on the CPU die.

The term GPU was popularized by Nvidia in 1999, who marketed the GeForce 256 as "the world's first 'GPU', or Graphics Processing Unit, a single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that are capable of processing a minimum of 10 million polygons per second". Rival ATI Technologies coined the term visual processing unit or VPU with the release of the Radeon 9700 in 2002.

Computational functions

Modern GPUs use most of their transistors to do calculations related to 3D computer graphics. They were initially used to accelerate the memory-intensive work of texture mapping and rendering polygons, later adding units to accelerate geometric calculations such as the rotation and translation of vertices into different coordinate systems. Recent developments in GPUs include support for programmable shaders which can manipulate vertices and textures with many of the same operations supported by CPUs, oversampling and interpolation techniques to reduce aliasing, and very high-precision color spaces. Because most of these computations involve matrix and vector operations, engineers and scientists have increasingly studied the use of GPUs for non-graphical calculations.
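
Because the paragraph above cites the rotation and translation of vertices as a typical matrix/vector workload, here is a tiny CPU-side sketch of a vertex rotation in plain Python; a GPU performs the same kind of transform on millions of vertices in parallel. The triangle data is made up for illustration.

    import math

    def rotate_z(vertices, angle_radians):
        # Rotate a list of (x, y, z) vertices about the Z axis.
        c, s = math.cos(angle_radians), math.sin(angle_radians)
        return [(c * x - s * y, s * x + c * y, z) for x, y, z in vertices]

    triangle = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 0.0)]
    print(rotate_z(triangle, math.pi / 2))   # each vertex rotated by 90 degrees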

In addition to the 3D hardware, today's GPUs include basic 2D acceleration and framebuffer capabilities (usually with a VGA compatibility mode). Some newer cards, such as the AMD/ATI HD 5000–HD 7000 series, even lack dedicated 2D acceleration; it has to be emulated by the 3D hardware.

What is the iPhone?



iPhone is a line of smartphones designed and marketed by Apple Inc. It runs Apple's iOS mobile operating system. The first-generation iPhone was released on June 29, 2007; the most recent iPhone models are the iPhone 6 and iPhone 6 Plus, which were unveiled at a special event on September 9, 2014.

The user interface is built around the device's multi-touch screen, including a virtual keyboard. The iPhone has Wi-Fi and can connect to many cellular networks, including 1xRTT (represented by a 1x on the status bar) and GPRS (shown as GPRS on the status bar), EDGE (shown as a capital E on the status bar), UMTS and EV-DO (shown as 3G), a faster version of UMTS and 4G (shown as a 4G symbol on the status bar), and LTE (shown as LTE on the status bar). An iPhone can shoot video (though this was not a standard feature until the iPhone 3GS), take photos, play music, send and receive email, browse the web, send texts, provide GPS navigation, record notes, perform mathematical calculations, and receive visual voicemail. Other functions—video games, reference works, social networking, etc.—can be enabled by downloading application programs ("apps"); as of October 2013, the App Store offered more than one million apps by Apple and third parties and is ranked as the world's second largest mobile software distribution network of its kind (by number of currently available applications).

There are eight generations of iPhone models, each accompanied by one of the eight major releases of iOS. The original 1st-generation iPhone was a GSM phone and established design precedents, such as a button placement that has persisted throughout all releases and a screen size maintained for the next four iterations. The iPhone 3G added 3G cellular network capabilities and A-GPS location. The iPhone 3GS added a faster processor and a higher-resolution camera that could record video at 480p. The iPhone 4 featured a higher-resolution 960×640 "Retina Display", a VGA front-facing camera for video calling and other apps, and a 5-megapixel rear-facing camera with 720p video capture. The iPhone 4S upgraded to an 8-megapixel camera with 1080p video recording, a dual-core A5 processor, and a natural language voice control system called Siri. The iPhone 5 featured the dual-core A6 processor, increased the size of the Retina display to 4 inches, introduced LTE support and replaced the 30-pin connector with an all-digital Lightning connector. The iPhone 5S featured the dual-core 64-bit A7 processor, an updated camera with a larger aperture and dual-LED flash, and the Touch ID fingerprint scanner, integrated into the home button. The iPhone 5C featured the same A6 chip as the iPhone 5, along with a new backside-illuminated FaceTime camera and a new casing made of polycarbonate. The iPhone 6 and iPhone 6 Plus further increased screen size, measuring 4.7 inches and 5.5 inches, respectively; they also feature a new A8 chip and M8 motion coprocessor. As of 2013, the iPhone 3GS had the longest production run, 1,181 days, followed by the iPhone 4, produced for 1,174 days.

The resounding sales of the iPhone, at the time, have been credited with reshaping the smartphone industry and helping make Apple one of the world's most valuable publicly traded companies in 2011–12.

In late 2014, JP Morgan estimated that the "iPhone percentage of the worldwide smartphone install base has been around 15% since late 2012", far behind the dominant Android-based smartphones. In a few mature markets, such as Japan, the iPhone holds a majority share, an exception to Android's dominance; in Australia, Android is rapidly approaching parity. In March 2014, cumulative sales of the iPhone brand reached 500 million devices. In the last quarter of 2014, a record 74.5 million iPhones were sold, compared to 51.0 million in the last quarter of 2013. Tim Cook revealed at the Apple Watch conference on March 9, 2015 that Apple had sold a total of 700 million iPhones to date.

Tuesday, April 21, 2015

Printer

Printer

In computing, a printer is a peripheral which makes a persistent human-readable representation of graphics or text on paper or similar physical media. The two most common printer mechanisms are black and white laser printers, used for common documents, and color inkjet printers, which can produce high-quality photograph output.

The world's first computer printer was a 19th-century mechanically driven apparatus invented by Charles Babbage for his difference engine. This system used a series of metal rods with characters printed on them and stuck a roll of paper against the rods to print the characters. The first commercial printers generally used mechanisms from electric typewriters and Teletype machines, which operated in a similar fashion. The demand for higher speed led to the development of new systems specifically for computer use. Among the systems widely used through the 1980s were daisy wheel systems similar to typewriters, line printers that produced similar output but at much higher speed, and dot matrix systems that could mix text and graphics but produced relatively low-quality output. The plotter was used for those requiring high quality line art like blueprints.

The introduction of the low-cost laser printer in 1984 with the first HP LaserJet, and the addition of PostScript in next year's Apple LaserWriter, set off a revolution in printing known as desktop publishing. Laser printers using PostScript mixed text and graphics, like dot-matrix printers, but at quality levels formerly available only from commercial typesetting systems. By 1990, most simple printing tasks like fliers and brochures were being created on personal computers and then laser printed; expensive offset printing systems were being dumped as scrap. The HP Deskjet of 1988 offered the same advantages as the laser printer in terms of flexibility, but produced somewhat lower-quality output (depending on the paper) from much less expensive mechanisms. Inkjet systems rapidly displaced dot matrix and daisy wheel printers from the market. By the 2000s, high-quality printers of this sort had fallen under the $100 price point and became commonplace.

The rapid uptake of internet email through the 1990s and into the 2000s has largely displaced the need for printing as a means of moving documents, and a wide variety of reliable storage systems means that a "physical backup" is of little benefit today. Even the desire for printed output for "offline reading" while on mass transit or aircraft has been displaced by e-book readers and tablet computers. Today, traditional printers are being used more for special purposes, like printing photographs or artwork, and are no longer a must-have peripheral.

Starting around 2010, 3D printing has become an area of intense interest, allowing the creation of physical objects with the same sort of effort as an early laser printer required to produce a brochure. These devices are in their earliest stages of development and have not yet become commonplace.

Types of printers

Personal printers are primarily designed to support individual users, and may be connected to only a single computer. These printers are designed for low-volume, short-turnaround print jobs, requiring minimal setup time to produce a hard copy of a given document. However, they are generally slow devices, ranging from 6 to around 25 pages per minute (ppm), and the cost per page is relatively high; this is offset by the on-demand convenience. Some printers can print documents stored on memory cards or from digital cameras and scanners.
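
To make the pages-per-minute and cost-per-page figures concrete, the short sketch below estimates how long a job takes and what it costs at a given speed and per-page price. The 10 ppm and $0.08 values are illustrative assumptions rather than figures from the text.

    def job_time_minutes(pages, pages_per_minute):
        # Approximate print time, ignoring warm-up and spooling.
        return pages / pages_per_minute

    def job_cost(pages, cost_per_page):
        return pages * cost_per_page

    # A 50-page document on a 10 ppm personal printer at an assumed $0.08 per page.
    print(job_time_minutes(50, 10))   # 5.0 minutes
    print(job_cost(50, 0.08))         # 4.0 dollars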

Networked or shared printers are "designed for high-volume, high-speed printing." They are usually shared by many users on a network and can print at speeds of 45 to around 100 ppm. The Xerox 9700 could achieve 120 ppm.
A virtual printer is a piece of computer software whose user interface and API resembles that of a printer driver, but which is not connected with a physical computer printer.

A 3D printer is a device for making a three-dimensional object from a 3D model or other electronic data source through additive processes in which successive layers of material (including plastics, metals, food, cement, wood, and other materials) are laid down under computer control. It is called a printer by analogy with an inkjet printer, which produces a two-dimensional document by a similar process of depositing a layer of ink on paper.

Technology

The choice of print technology has a great effect on the cost of the printer and cost of operation, speed, quality and permanence of documents, and noise. Some printer technologies don't work with certain types of physical media, such as carbon paper or transparencies.

A second aspect of printer technology that is often forgotten is resistance to alteration: liquid ink, such as from an inkjet head or fabric ribbon, becomes absorbed by the paper fibers, so documents printed with liquid ink are more difficult to alter than documents printed with toner or solid inks, which do not penetrate below the paper surface.

Cheques can be printed with liquid ink or on special cheque paper with toner anchorage so that alterations may be detected. The machine-readable lower portion of a cheque must be printed using MICR toner or ink. Banks and other clearing houses employ automation equipment that relies on the magnetic flux from these specially printed characters to function properly.

DVD player



A DVD player is a device that plays discs produced under both the DVD-Video and DVD-Audio technical standards, two different and incompatible standards. Some DVD players will also play audio CDs. DVD players are connected to a television to watch the DVD content, which could be a movie, a recorded TV show, or other content.

The first DVD player was created by Tatung Company in Taiwan in collaboration with Pacific Digital Company from the United States in 1994. Some manufacturers originally announced that DVD players would be available as early as the middle of 1996. These predictions were too optimistic. Delivery was initially held up for "political" reasons of copy protection demanded by movie studios, but was later delayed by lack of movie titles. The first players appeared in Japan in November, 1996, followed by U.S. players in March, 1997, with distribution limited to only 7 major cities for the first 6 months.

Players slowly trickled into other regions around the world. Prices for the first players in 1997 were $1000 and up. By the end of 2000, players were available for under $100 at discount retailers. In 2003 players became available for under $50. Six years after the initial launch, close to one thousand models of DVD players were available from over a hundred consumer electronics manufacturers.

Fujitsu released the first DVD-ROM-equipped computer on November 6, 1996 in Japan. Toshiba released a DVD-ROM-equipped computer and a DVD-ROM drive in Japan in early 1997 (moved back from December, which was moved back from November). DVD-ROM drives from Toshiba, Pioneer, Panasonic, Hitachi, and Sony began appearing in sample quantities as early as January 1997, but none were available before May. The first PC upgrade kits (a combination of DVD-ROM drive and hardware decoder card) became available from Creative Labs, Hi-Val, and Diamond Multimedia in April and May of 1997. In 2014, every major PC manufacturer has models that include DVD-ROM drives.

The first DVD-Audio players were released in Japan by Pioneer in late 1999, but they did not play copy-protected discs. Matsushita (under the Panasonic and Technics labels) first released full-fledged players in July 2000 for $700 to $1,200. DVD-Audio players are now also made by Aiwa, Denon, JVC, Kenwood, Madrigal, Marantz, Nakamichi, Onkyo, Toshiba, Yamaha, and others. Sony released the first SACD players in May 1999 for $5,000. Pioneer's first DVD-Audio players released in late 1999 also played SACD. SACD players are now also made by Accuphase, Aiwa, Denon, Kenwood, Marantz, Philips, Sharp, and others.

Sunday, April 19, 2015

The History of the Gadget




The origins of the word "gadget" trace back to the 19th century. According to the Oxford English Dictionary, there is anecdotal (not necessarily true) evidence for the use of "gadget" as a placeholder name for a technical item whose precise name one can't remember since the 1850s, with Robert Brown's 1886 book Spunyarn and Spindrift, A sailor boy's log of a voyage out and home in a China tea-clipper containing the earliest known usage in print. The etymology of the word is disputed.

A widely circulated story holds that the word gadget was "invented" when Gaget, Gauthier & Cie, the company behind the repoussé construction of the Statue of Liberty (1886), made a small-scale version of the monument and named it after their firm; however, this contradicts the evidence that the word was already in use earlier in nautical circles, and the fact that it did not become popular, at least in the USA, until after World War I. Other sources cite a derivation from the French gâchette, which has been applied to various pieces of a firing mechanism, or the French gagée, a small tool or accessory.

The October 1918 issue of Notes and Queries contains a multi-article entry on the word "gadget" (12 S. iv. 187). H. Tapley-Soper of The City Library, Exeter, writes:

A discussion arose at the Plymouth meeting of the Devonshire Association in 1916 when it was suggested that this word should be recorded in the list of local verbal provincialisms. Several members dissented from its inclusion on the ground that it is in common use throughout the country; and a naval officer who was present said that it has for years been a popular expression in the service for a tool or implement, the exact name of which is unknown or has for the moment been forgotten. I have also frequently heard it applied by motor-cycle friends to the collection of fitments to be seen on motor cycles. 'His handle-bars are smothered in gadgets' refers to such things as speedometers, mirrors, levers, badges, mascots, &c., attached to the steering handles. The 'jigger' or short-rest used in billiards is also often called a 'gadget'; and the name has been applied by local platelayers to the 'gauge' used to test the accuracy of their work. In fact, to borrow from present-day Army slang, 'gadget' is applied to 'any old thing.'

The usage of the term in military parlance extended beyond the navy. In the book "Above the Battle" by Vivian Drake, published in 1918 by D. Appleton & Co., of New York and London, being the memoirs of a pilot in the British Royal Flying Corps, there is the following passage: "Our ennui was occasionally relieved by new gadgets -- "gadget" is the Flying Corps slang for invention! Some gadgets were good, some comic and some extraordinary."

By the second half of the twentieth century, the term "gadget" had taken on the connotations of compactness and mobility. In the 1965 essay "The Great Gizmo" (a term used interchangeably with "gadget" throughout the essay), the architectural and design critic Reyner Banham defines the item as:

A characteristic class of US products––perhaps the most characteristic––is a small self-contained unit of high performance in relation to its size and cost, whose function is to transform some undifferentiated set of circumstances to a condition nearer human desires. The minimum of skills is required in its installation and use, and it is independent of any physical or social infrastructure beyond that by which it may be ordered from catalogue and delivered to its prospective user. A class of servants to human needs, these clip-on devices, these portable gadgets, have coloured American thought and action far more deeply––I suspect––than is commonly understood.

Saturday, April 18, 2015

Motherboard


A motherboard (sometimes alternatively known as the mainboard, system board, planar board or logic board, or colloquially, a mobo) is the main printed circuit board (PCB) found in computers and other expandable systems. It holds many of the crucial electronic components of the system, such as the central processing unit (CPU) and memory, and provides connectors for other peripherals. Unlike a backplane, a motherboard contains significant sub-systems such as the processor and other components.

Motherboard specifically refers to a PCB with expansion capability, and, as the name suggests, this board is the "mother" of all components attached to it, which often include sound cards, video cards, network cards, hard drives or other forms of persistent storage, TV tuner cards, cards providing extra USB or FireWire slots, and a variety of other custom components (the term mainboard is applied to devices with a single board and no additional expansions or capability, such as the controlling boards in televisions, washing machines and other embedded systems).

CPU sockets

A CPU (central processing unit) socket or slot is an electrical component that attaches to a printed circuit board (PCB) and is designed to house a CPU (also called a microprocessor). It is a special type of integrated circuit socket designed for very high pin counts. A CPU socket provides many functions, including a physical structure to support the CPU, support for a heat sink, facilitating replacement (as well as reducing cost), and, most importantly, forming an electrical interface both with the CPU and the PCB. CPU sockets are found on the motherboard in most desktop and server computers (laptops typically use surface-mount CPUs), particularly those based on the Intel x86 architecture. A CPU socket type and motherboard chipset must support the CPU series and speed.

Integrated peripherals

Block diagram of a modern motherboard, which supports many on-board peripheral functions as well as several expansion slots
With the steadily declining costs and size of integrated circuits, it is now possible to include support for many peripherals on the motherboard. By combining many functions on one PCB, the physical size and total cost of the system may be reduced; highly integrated motherboards are thus especially popular in small form factor and budget computers.

A typical integrated motherboard may include, for example:

Disk controllers for a floppy disk drive, up to 2 PATA drives, and up to 6 SATA drives (including RAID 0/1 support)
An integrated graphics controller supporting 2D and 3D graphics, with VGA and TV output
An integrated sound card supporting 8-channel (7.1) audio and S/PDIF output
A Fast Ethernet network controller for 10/100 Mbit networking
A USB 2.0 controller supporting up to 12 USB ports
An IrDA controller for infrared data communication (e.g. with an IrDA-enabled cellular phone or printer)
Temperature, voltage, and fan-speed sensors that allow software to monitor the health of computer components

Peripheral card slots

A typical motherboard of 2012 has a varying number of connections, depending on its standard.

A standard ATX motherboard will typically have two or three PCI-E 16x connections for a graphics card, one or two legacy PCI slots for various expansion cards, and one or two PCI-E 1x slots (which have superseded PCI). A standard EATX motherboard will have two to four PCI-E 16x connections for graphics cards, and a varying number of PCI and PCI-E 1x slots. It can sometimes also have a PCI-E 4x slot (this varies between brands and models).

Some motherboards have two or more PCI-E 16x slots, to allow more than two monitors without special hardware, or to use special graphics technologies called SLI (for Nvidia) and CrossFire (for ATI). These allow two to four graphics cards to be linked together, for better performance in intensive graphical computing tasks such as gaming and video editing.

Wednesday, April 15, 2015

Freezer


Freezer



Freezer units are used in households and in industry and commerce. Food stored at or below −18 °C (0 °F) is safe indefinitely. Most household freezers maintain temperatures from −23 to −18 °C (−9 to 0 °F), although some freezer-only units can achieve −34 °C (−29 °F) and lower. Refrigerators generally do not achieve lower than −23 °C (−9 °F), since the same coolant loop serves both compartments: Lowering the freezer compartment temperature excessively causes difficulties in maintaining above-freezing temperature in the refrigerator compartment. Domestic freezers can be included as a separate compartment in a refrigerator, or can be a separate appliance. Domestic freezers are generally upright units resembling refrigerators or chests (upright units laid on their backs). Many modern upright freezers come with an ice dispenser built into their door. Some upscale models include thermostat displays and controls, and flat-screen televisions have even been incorporated.

Effect on lifestyle

The refrigerator allows the modern family to keep food fresh for longer than before. The most notable improvement is for meat and other highly perishable wares, which needed to be refined to gain anything resembling shelf life. (On the other hand, refrigerators and freezers can also be stocked with processed, quick-cook foods that are less healthy.) Refrigeration in transit makes it possible to enjoy foodstuffs from distant places.

Dairy products, meats, fish, poultry and vegetables can be kept refrigerated in the same space within the kitchen (although raw meat should be kept separate from other foodstuffs for reasons of hygiene).

Freezers allow people to buy food in bulk and eat it at leisure, and bulk purchases save money. Ice cream, a popular commodity of the 20th century, could previously only be obtained by traveling to where the product was made and eating it on the spot. Now it is a common food item. Ice on demand not only adds to the enjoyment of cold drinks, but is useful for first-aid, and for cold packs that can be kept frozen for picnics or in case of emergency.

Temperature zones and ratings

Commercial for electric refrigerators in Pittsburgh, Pennsylvania, 1926
Some refrigerators are now divided into four zones to store different types of food:

−18 °C (0 °F) (freezer)
0 °C (32 °F) (meat zone)
5 °C (41 °F) (cooling zone)
10 °C (50 °F) (crisper)
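
The Fahrenheit equivalents quoted for these zones follow from the standard conversion °F = °C × 9/5 + 32, which a few lines of Python can confirm:

    def celsius_to_fahrenheit(c):
        return c * 9 / 5 + 32

    for c in (-18, 0, 5, 10):
        print(c, "degC =", round(celsius_to_fahrenheit(c)), "degF")
    # -18 degC is -0.4 degF (rounded to 0 degF above); 0 degC = 32 degF, 5 degC = 41 degF, 10 degC = 50 degF
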
The capacity of a refrigerator is measured in either litres or cubic feet. Typically the volume of a combined refrigerator-freezer is split into 100 litres (3.53 cubic feet) for the freezer and 140 litres (4.94 cubic feet) for the refrigerator, although these values are highly variable.

Temperature settings for refrigerator and freezer compartments are often given arbitrary numbers by manufacturers (for example, 1 through 9, warmest to coldest), but generally 3 to 5 °C (37 to 41 °F) is ideal for the refrigerator compartment and −18 °C (0 °F) for the freezer. Some refrigerators must be kept within certain external temperature parameters to run properly. This can be an issue when placing units in an unfinished area, such as a garage. European freezers, and refrigerators with a freezer compartment, are graded with a four-star rating system.

[∗]  : min temperature = −6 °C (21 °F). Maximum storage time for (pre-frozen) food is 1 week
[∗∗]  : min temperature = −12 °C (10 °F). Maximum storage time for (pre-frozen) food is 1 month
[∗∗∗]  : min temperature = −18 °C (0 °F). Maximum storage time for (pre-frozen) food is between 3 and 12 months depending on type (meat, vegetables, fish, etc.)
[∗∗∗∗] : min temperature = −18 °C (0 °F). Maximum storage time for pre-frozen or frozen-from-fresh food is between 3 and 12 months
Although both the three and four star ratings specify the same storage times and same minimum temperature of −18 °C (0 °F), only a four star freezer is intended for freezing fresh food, and may include a "fast freeze" function (runs the compressor continually, down to as low as −26 °C (−15 °F)) to facilitate this. Three (or fewer) stars are used for frozen food compartments that are only suitable for storing frozen food; introducing fresh food into such a compartment is likely to result in unacceptable temperature rises. This difference in categorisation is shown in the design of the 4-star logo, where the "standard" three stars are displayed in a box using "positive" colours, denoting the same normal operation as a 3-star freezer, and the fourth star showing the additional fresh food/fast freeze function is prefixed to the box in "negative" colours or with other distinct formatting.

Most European refrigerators include a moist cold refrigerator section (which does require (automatic) defrosting at irregular intervals) and a (rarely frost free) freezer section.

Macintosh

Macintosh



The Macintosh (branded as Mac since 1998) is a series of personal computers (PCs) designed, developed, and marketed by Apple Inc. Steve Jobs introduced the original Macintosh computer on January 24, 1984. This was the first mass-market personal computer featuring an integral graphical user interface and mouse. The first model was later renamed "Macintosh 128K" to distinguish it within a growing family of subsequently updated models based on the same proprietary Apple architecture. Since 1998, Apple has largely phased out the Macintosh name in favor of "Mac", though the product family has been nicknamed "Mac" or "the Mac" since the development of the first model.



The Macintosh, however, was expensive, which hindered its ability to be competitive in a market already dominated by the Commodore 64 for consumers, as well as the IBM Personal Computer and its accompanying clone market for businesses. Macintosh systems still found success in education and desktop publishing and kept Apple as the second-largest PC manufacturer for the next decade. In the 1990s, improvements in the rival Wintel platform, notably with the introduction of Windows 3.0, gradually took market share from the more expensive Macintosh systems. The performance advantage of 68000-based Macintosh systems was eroded by Intel's Pentium, and in 1994 Apple was relegated to third place as Compaq became the top PC manufacturer. Even after a transition to the superior PowerPC-based Power Macintosh line in 1994, the falling prices of commodity PC components and the release of Windows 95 saw the Macintosh user base decline.



In 1998, after the return of Steve Jobs, Apple consolidated its multiple consumer-level desktop models into the all-in-one iMac G3, which became a commercial success and revitalized the brand. Since the transition to Intel processors in 2006, the complete lineup has been based entirely on those processors and associated systems. The current lineup comprises three desktops (the all-in-one iMac, entry-level Mac mini, and the Mac Pro tower graphics workstation) and four laptops (the MacBook, MacBook Air, MacBook Pro, and MacBook Pro with Retina display). The Xserve server was discontinued in 2011 in favor of the Mac mini and Mac Pro.



Production of the Mac is based on a partial vertical integration model. While Apple designs its own hardware and creates its own operating system that is pre-installed on all Mac computers, Apple sources components, such as microprocessors, RAM and LCD panels from other vendors. Apple also relies on contract manufacturers like Foxconn and Flextronics to build most of its products. This approach differs from most IBM PC compatibles, where multiple sellers create and integrate hardware intended to run another company's operating system.



Apple also develops the operating system for the Mac, currently OS X version 10.10 "Yosemite". Macs are currently capable of running non-Apple operating systems such as Linux, OpenBSD, and Microsoft Windows with the aid of Boot Camp or third-party software. Apple does not license OS X for use on non-Apple computers, though it did license previous versions of Mac OS through their Macintosh clone program from 1995 to 1997.