Tuesday, April 28, 2015

What is Android?

Android is a mobile operating system (OS) based on the Linux kernel and currently developed by Google. With a user interface based on direct manipulation, Android is designed primarily for touchscreen mobile devices such as smartphones and tablet computers, with specialized user interfaces for televisions (Android TV), cars (Android Auto), and wrist watches (Android Wear). The OS uses touch inputs that loosely correspond to real-world actions, such as swiping, tapping, pinching, and reverse pinching, to manipulate on-screen objects, along with a virtual keyboard. Despite being designed primarily for touchscreen input, it has also been used in game consoles, digital cameras, regular PCs (e.g. the HP Slate 21), and other electronics.

As of July 2013, more than one million Android applications ("apps") had been published on the Google Play store, and over 50 billion apps had been downloaded. A developer survey conducted in April–May 2013 found that 71% of mobile developers develop for Android. At Google I/O 2014, the company revealed that there were over one billion monthly active Android users, up from 538 million in June 2013. As of 2015, Android has the largest installed base of all general-purpose operating systems.

Android's source code is released by Google under open source licenses, although most Android devices ultimately ship with a combination of open source and proprietary software, including proprietary software developed and licensed by Google. Initially developed by Android, Inc., which Google backed financially and later bought in 2005, Android was unveiled in 2007 along with the founding of the Open Handset Alliance, a consortium of hardware, software, and telecommunication companies devoted to advancing open standards for mobile devices.

Android is popular with technology companies which require a ready-made, low-cost and customizable operating system for high-tech devices. Android's open nature has encouraged a large community of developers and enthusiasts to use the open-source code as a foundation for community-driven projects, which add new features for advanced users or bring Android to devices which were officially released running other operating systems. The operating system's success has made it a target for patent litigation as part of the so-called "smartphone wars" between technology companies.

Saturday, April 25, 2015

What is a Smartphone?




A smartphone (or smart phone) is a mobile phone with an advanced operating system. Smartphones typically combine the features of a phone with those of other popular mobile devices, such as a personal digital assistant (PDA), media player, and GPS navigation unit. Most have a touchscreen interface, can run third-party apps, and are camera phones. Later smartphones added broadband internet access and web browsing, Wi-Fi, motion sensors, and mobile payment mechanisms.

In February 2014, 93% of mobile developers were targeting smartphones first for app development. In 2014, worldwide smartphone sales topped 1.2 billion units, up 28% from 2013.

Early years

The first caller identification receiver
Devices that combined telephony and computing were first conceptualized by Theodore G. Paraskevakos in 1971 and patented in 1974, and were offered for sale beginning in 1993. He was the first to introduce the concepts of intelligence, data processing, and visual display screens into telephones, concepts that gave rise to the "smartphone." In 1971, Paraskevakos, working with Boeing in Huntsville, Alabama, demonstrated a transmitter and receiver that provided additional ways to communicate with remote equipment; however, it did not yet have the general-purpose PDA applications in a wireless device typical of smartphones. The units were installed at Peoples' Telephone Company in Leesburg, Alabama and were demonstrated to several telephone companies. The original and historic working models are still in the possession of Paraskevakos.

Forerunners

IBM Simon and charging base (1993)
The first mobile phone to incorporate PDA features was an IBM prototype developed in 1992 and demonstrated that year at the COMDEX computer industry trade show. A refined version of the product was marketed to consumers in 1994 by BellSouth under the name Simon Personal Communicator. The Simon was the first cellular device that could properly be called a "smartphone", although the term was not yet in use in 1994. In addition to making and receiving cellular phone calls, the Simon could send and receive faxes and e-mails and included several other apps, such as an address book, calendar, appointment scheduler, calculator, world time clock, and note pad, all through its touch-screen display. The Simon was thus the first smartphone to incorporate the features of a PDA.

The term "smart phone" appeared in print in 1995, for describing AT&T's "PhoneWriter(TM) Communicator" as a "smart phone".

iPhone & Android

In 2007, Apple Inc. introduced the iPhone, one of the first mobile phones to use a multi-touch interface. The iPhone was notable for its use of a large touchscreen for direct finger input as its main means of interaction, instead of the stylus, keyboard, or keypad typical of smartphones at the time. 2008 saw the release of the first Android phone, the HTC Dream (also known as the T-Mobile G1). Android is an open-source platform founded by Andy Rubin and now owned by Google. Although Android's adoption was relatively slow at first, it started to gain widespread popularity in 2010 and now dominates the market.

These new platforms led to the decline of earlier ones. Microsoft, for instance, started a new OS from scratch in the form of Windows Phone; Nokia abandoned Symbian and partnered with Microsoft to use Windows Phone on its smartphones, and Windows Phone then became the third-most-popular OS. Palm was bought by Hewlett-Packard; its webOS became Open webOS and was later sold to LG Electronics. Research In Motion also made a new system from scratch, BlackBerry 10.

The capacitive touchscreen also had a knock-on effect on smartphone form factors. Before 2007 it was common for devices to have a physical numeric keypad or physical QWERTY keyboard in either a candybar or sliding form factor. However, by 2010, there were no top-tier smartphones with physical numeric keypads.

The future

In 2013, the Fairphone company launched its first "socially ethical" smartphone at the London Design Festival, to address concerns about the sourcing of materials in its manufacturing.
In late 2013, QSAlpha commenced production of a smartphone designed entirely around security, encryption, and identity protection.
In December 2013, the world's first curved-OLED smartphones reached the retail market with the Samsung Galaxy Round and LG G Flex models. Samsung phones with more bends and folds in their screens are expected in 2015.
Foldable OLED smartphones could be as much as a decade away because of the cost of producing them. There is a relatively high failure rate when producing these screens. As little as a speck of dust can ruin a screen during production. Creating a battery that can be folded is another hurdle.
A clear, thin layer of crystal glass can be added to small screens such as watches and smartphones to make them solar-powered; smartphones could gain 15% more battery life during a typical day. The first smartphones using this technology should arrive in 2015. Such a screen can also receive Li-Fi signals, as can the smartphone camera. The cost of these screens is between $2 and $3 per smartphone, much cheaper than most new technology.
Near future smartphones might not have a traditional battery as their sole source of power. Instead, they may pull energy from radio, television, cellular or Wi-Fi signals.
As of early 2014, smartphones have begun to use Quad HD (2K) 2560×1440 displays on 5.5" screens, with up to 534 ppi on devices such as the LG G3, a significant improvement over Apple's Retina display (see the pixel-density calculation just after this list). Quad HD is used in advanced televisions and computer monitors, but with 110 ppi or less on such larger displays.
As of 2014, Wi-Fi networks are widely used by smartphones. As Wi-Fi becomes more prevalent and easier to connect to, Wi-Fi-based phone service is expected to take off.
Since 2013, water- and dust-proofing have made their way into mainstream high-end smartphones rather than specialist models, beginning with the Sony Xperia Z and continuing through the Xperia Z3, as well as models from other manufacturers such as the Samsung Galaxy S5.
Focusing remains a weak point of smartphone cameras, but the LG G3 Beat with laser autofocus offers eight focus points: touching an object on the screen focuses on it, while the rest of the frame falls into bokeh.
Some smartphones can be categorized as high-end point-and-shoot cameras, with large sensors of up to 1 inch, 20-megapixel resolution, and 4K video. Some can store their pictures in proprietary raw image formats, and Android 5.0 Lollipop added platform support for raw image capture.
Modular smartphones, in which users can remove and replace parts, are also projected.
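
Pixel-density figures like the 534 ppi quoted above follow directly from a display's resolution and diagonal size: ppi is the diagonal resolution in pixels divided by the diagonal in inches. A minimal Python sketch of that calculation, using the LG G3 figures from the list above:

```python
import math

def pixels_per_inch(width_px, height_px, diagonal_inches):
    # Pixel density = diagonal resolution in pixels / diagonal size in inches.
    diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
    return diagonal_px / diagonal_inches

# Quad HD (2560x1440) on a 5.5-inch panel, as on the LG G3:
print(round(pixels_per_inch(2560, 1440, 5.5)))  # -> 534
```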

Thursday, April 23, 2015

Air conditioning





Air conditioning (often referred to as A/C, AC or aircon) is the process of altering the properties of air (primarily temperature and humidity) to more comfortable conditions, typically with the aim of distributing the conditioned air to an occupied space to improve thermal comfort and indoor air quality.

In common use, an air conditioner is a device that lowers the air temperature. The cooling is typically achieved through a refrigeration cycle, but sometimes evaporation or free cooling is used. Air conditioning systems can also be made based on desiccants.

In the most general sense, air conditioning can refer to any form of technology that modifies the condition of air (heating, cooling, (de-)humidification, cleaning, ventilation, or air movement). However, in construction, such a complete system of heating, ventilation, and air conditioning is referred to as HVAC (as opposed to AC).

The basic concept behind air conditioning is said to have been applied in ancient Egypt, where reeds were hung in windows and were moistened with trickling water. The evaporation of water cooled the air blowing through the window. This process also made the air more humid, which can be beneficial in a dry desert climate. In Ancient Rome, water from aqueducts was circulated through the walls of certain houses to cool them. Other techniques in medieval Persia involved the use of cisterns and wind towers to cool buildings during the hot season.

Modern air conditioning emerged from advances in chemistry during the 19th century, and the first large-scale electrical air conditioning was invented and used in 1902 by American inventor Willis Carrier. The introduction of residential air conditioning in the 1920s helped enable the great migration to the Sun Belt in the United States.

Graphics processing unit





A graphics processing unit (GPU), also occasionally called a visual processing unit (VPU), is a specialized electronic circuit designed to rapidly manipulate and alter memory to accelerate the creation of images in a frame buffer intended for output to a display. GPUs are used in embedded systems, mobile phones, personal computers, workstations, and game consoles. Modern GPUs are very efficient at manipulating computer graphics and image processing, and their highly parallel structure makes them more effective than general-purpose CPUs for algorithms where processing of large blocks of data is done in parallel. In a personal computer, a GPU can be present on a video card, or it can be embedded on the motherboard or—in certain CPUs—on the CPU die.

The term GPU was popularized by Nvidia in 1999, which marketed the GeForce 256 as "the world's first 'GPU', or Graphics Processing Unit, a single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that are capable of processing a minimum of 10 million polygons per second". Rival ATI Technologies coined the term visual processing unit, or VPU, with the release of the Radeon 9700 in 2002.

Computational functions

Modern GPUs use most of their transistors to do calculations related to 3D computer graphics. They were initially used to accelerate the memory-intensive work of texture mapping and rendering polygons, later adding units to accelerate geometric calculations such as the rotation and translation of vertices into different coordinate systems. Recent developments in GPUs include support for programmable shaders which can manipulate vertices and textures with many of the same operations supported by CPUs, oversampling and interpolation techniques to reduce aliasing, and very high-precision color spaces. Because most of these computations involve matrix and vector operations, engineers and scientists have increasingly studied the use of GPUs for non-graphical calculations.
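
The appeal of GPUs for these workloads is that the same arithmetic is applied independently to every vertex or pixel. That data-parallel pattern can be illustrated in ordinary Python with NumPy, whose vectorized array operations mirror the batched style GPUs accelerate; this is a minimal CPU-side sketch of the idea, not actual GPU code:

```python
import numpy as np

# Rotate one million 2D vertices by 30 degrees in a single batched
# matrix operation -- identical math applied to every data element,
# which is exactly the workload GPU hardware parallelizes.
theta = np.radians(30)
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

vertices = np.random.rand(1_000_000, 2)   # one (x, y) vertex per row
rotated = vertices @ rotation.T           # same rotation applied to all rows

print(rotated.shape)  # (1000000, 2)
```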

In addition to the 3D hardware, today's GPUs include basic 2D acceleration and framebuffer capabilities (usually with a VGA compatibility mode). Some newer cards, such as the AMD/ATI HD 5000–HD 7000 series, even lack dedicated 2D acceleration, which must be emulated by the 3D hardware.

What is the iPhone?



The iPhone is a line of smartphones designed and marketed by Apple Inc. that runs Apple's iOS mobile operating system. The first-generation iPhone was released on June 29, 2007; the most recent models are the iPhone 6 and iPhone 6 Plus, which were unveiled at a special event on September 9, 2014.

The user interface is built around the device's multi-touch screen, including a virtual keyboard. The iPhone has Wi-Fi and can connect to many cellular networks, including 1xRTT (represented by "1x" on the status bar), GPRS (shown as "GPRS"), EDGE (shown as a capital "E"), UMTS and EV-DO (shown as "3G"), a faster version of UMTS (shown as a "4G" symbol), and LTE (shown as "LTE"). An iPhone can shoot video (though this was not a standard feature until the iPhone 3GS), take photos, play music, send and receive email, browse the web, send and receive text messages, provide GPS navigation, record notes, perform mathematical calculations, and receive visual voicemail. Other functions, such as video games, reference works, and social networking, can be enabled by downloading application programs ("apps"); as of October 2013, the App Store offered more than one million apps by Apple and third parties, and is ranked as the world's second-largest mobile software distribution network of its kind by number of currently available applications.

There are eight generations of iPhone models, each accompanied by one of the eight major releases of iOS. The original first-generation iPhone was a GSM phone and established design precedents, such as a button placement that has persisted throughout all releases and a screen size maintained for the next four iterations. The iPhone 3G added 3G cellular network capabilities and A-GPS location. The iPhone 3GS added a faster processor and a higher-resolution camera that could record video at 480p. The iPhone 4 featured a higher-resolution 960×640 "Retina display", a VGA front-facing camera for video calling and other apps, and a 5-megapixel rear-facing camera with 720p video capture. The iPhone 4S upgraded to an 8-megapixel camera with 1080p video recording, a dual-core A5 processor, and a natural-language voice control system called Siri. The iPhone 5 featured the dual-core A6 processor, increased the size of the Retina display to 4 inches, introduced LTE support, and replaced the 30-pin connector with the all-digital Lightning connector. The iPhone 5S featured the dual-core 64-bit A7 processor, an updated camera with a larger aperture and dual-LED flash, and the Touch ID fingerprint scanner integrated into the home button. The iPhone 5C featured the same A6 chip as the iPhone 5, along with a new backside-illuminated FaceTime camera and a new casing made of polycarbonate. The iPhone 6 and iPhone 6 Plus further increased screen size, measuring 4.7 inches and 5.5 inches respectively, and added a new A8 chip and M8 motion coprocessor. As of 2013, the iPhone 3GS had the longest production run at 1,181 days, followed by the iPhone 4, produced for 1,174 days.

The iPhone's resounding sales have been credited with reshaping the smartphone industry and with helping make Apple one of the world's most valuable publicly traded companies in 2011–12.

In late 2014, JP Morgan estimated that the iPhone's share of the worldwide smartphone installed base had been around 15% since late 2012, far behind the dominant Android-based smartphones. In a few mature markets, such as Japan, the iPhone holds a majority share, an exception to Android's dominance; in Australia, Android is rapidly approaching parity. In March 2014, cumulative sales of the iPhone reached 500 million devices. In the last quarter of 2014, 74.5 million iPhones were sold, a new record, compared to 51.0 million in the last quarter of 2013. Tim Cook revealed at the Apple Watch event on March 9, 2015 that Apple had sold a total of 700 million iPhones to date.

Tuesday, April 21, 2015

Printer


In computing, a printer is a peripheral which makes a persistent human-readable representation of graphics or text on paper or similar physical media. The two most common printer mechanisms are black-and-white laser printers, used for common documents, and color inkjet printers, which can produce high-quality photographic output.

The world's first computer printer was a 19th-century mechanically driven apparatus invented by Charles Babbage for his difference engine. This system used a series of metal rods with characters printed on them and pressed a roll of paper against the rods to print the characters. The first commercial printers generally used mechanisms from electric typewriters and Teletype machines, which operated in a similar fashion. The demand for higher speed led to the development of new systems specifically for computer use. Among the systems widely used through the 1980s were daisy wheel systems similar to typewriters, line printers that produced similar output but at much higher speed, and dot matrix systems that could mix text and graphics but produced relatively low-quality output. The plotter served those requiring high-quality line art such as blueprints.

The introduction of the low-cost laser printer in 1984 with the first HP LaserJet, and the addition of PostScript in next year's Apple LaserWriter, set off a revolution in printing known as desktop publishing. Laser printers using PostScript mixed text and graphics, like dot-matrix printers, but at quality levels formerly available only from commercial typesetting systems. By 1990, most simple printing tasks such as fliers and brochures were created on personal computers and then laser-printed; expensive offset printing systems were being dumped as scrap. The HP Deskjet of 1988 offered the same advantages as the laser printer in terms of flexibility, but produced somewhat lower-quality output (depending on the paper) from much less expensive mechanisms. Inkjet systems rapidly displaced dot matrix and daisy wheel printers from the market. By the 2000s, high-quality printers of this sort had fallen below the $100 price point and became commonplace.

The rapid uptake of internet email through the 1990s and into the 2000s has largely displaced the need for printing as a means of moving documents, and the wide variety of reliable storage systems means that a "physical backup" is of little benefit today. Even the desire for printed output for "offline reading" while on mass transit or aircraft has been displaced by e-book readers and tablet computers. Today, traditional printers are being used more for special purposes, like printing photographs or artwork, and are no longer a must-have peripheral.

Starting around 2010, 3D printing has become an area of intense interest, allowing the creation of physical objects with the same sort of effort as an early laser printer required to produce a brochure. These devices are in their earliest stages of development and have not yet become commonplace.

Types of printers

Personal printers are primarily designed to support individual users, and may be connected to only a single computer. These printers are designed for low-volume, short-turnaround print jobs, requiring minimal setup time to produce a hard copy of a given document. However, they are generally slow devices, ranging from 6 to around 25 pages per minute (ppm), and the cost per page is relatively high; this is offset by the on-demand convenience. Some printers can print documents stored on memory cards or from digital cameras and scanners.

Networked or shared printers are "designed for high-volume, high-speed printing." They are usually shared by many users on a network and can print at speeds of 45 to around 100 ppm. The Xerox 9700 could achieve 120 ppm.

A virtual printer is a piece of computer software whose user interface and API resemble those of a printer driver, but which is not connected to a physical printer.

A 3D printer is a device for making a three-dimensional object from a 3D model or other electronic data source through additive processes, in which successive layers of material (including plastics, metals, food, cement, wood, and other materials) are laid down under computer control. It is called a printer by analogy with an inkjet printer, which produces a two-dimensional document by a similar process of depositing a layer of ink on paper.

Technology

The choice of print technology has a great effect on the cost of the printer and cost of operation, speed, quality and permanence of documents, and noise. Some printer technologies don't work with certain types of physical media, such as carbon paper or transparencies.

A second aspect of printer technology that is often forgotten is resistance to alteration: liquid ink, such as from an inkjet head or fabric ribbon, becomes absorbed by the paper fibers, so documents printed with liquid ink are more difficult to alter than documents printed with toner or solid inks, which do not penetrate below the paper surface.

Cheques can be printed with liquid ink or on special cheque paper with toner anchorage so that alterations may be detected. The machine-readable lower portion of a cheque must be printed using MICR toner or ink. Banks and other clearing houses employ automation equipment that relies on the magnetic flux from these specially printed characters to function properly.

DVD player



A DVD player is a device that plays discs produced under both the DVD-Video and DVD-Audio technical standards, two different and incompatible standards. Some DVD players will also play audio CDs. DVD players are connected to a television to watch the DVD content, which could be a movie, a recorded TV show, or other content.

The first DVD player was created by Tatung Company in Taiwan in collaboration with Pacific Digital Company from the United States in 1994. Some manufacturers originally announced that DVD players would be available as early as the middle of 1996, but these predictions were too optimistic. Delivery was initially held up for "political" reasons of copy protection demanded by movie studios, but was later delayed by a lack of movie titles. The first players appeared in Japan in November 1996, followed by U.S. players in March 1997, with distribution limited to only seven major cities for the first six months.

Players slowly trickled into other regions around the world. Prices for the first players in 1997 were $1000 and up. By the end of 2000, players were available for under $100 at discount retailers. In 2003 players became available for under $50. Six years after the initial launch, close to one thousand models of DVD players were available from over a hundred consumer electronics manufacturers.

Fujitsu released the first DVD-ROM-equipped computer on November 6, 1996, in Japan. Toshiba released a DVD-ROM-equipped computer and a DVD-ROM drive in Japan in early 1997 (pushed back from December, which itself had been pushed back from November). DVD-ROM drives from Toshiba, Pioneer, Panasonic, Hitachi, and Sony began appearing in sample quantities as early as January 1997, but none were available before May. The first PC upgrade kits (a combination of DVD-ROM drive and hardware decoder card) became available from Creative Labs, Hi-Val, and Diamond Multimedia in April and May of 1997. By 2014, every major PC manufacturer had models that included DVD-ROM drives.

The first DVD-Audio players were released in Japan by Pioneer in late 1999, but they did not play copy-protected discs. Matsushita (under the Panasonic and Technics labels) first released full-fledged players in July 2000 for $700 to $1,200. DVD-Audio players are now also made by Aiwa, Denon, JVC, Kenwood, Madrigal, Marantz, Nakamichi, Onkyo, Toshiba, Yamaha, and others. Sony released the first SACD players in May 1999 for $5,000. Pioneer's first DVD-Audio players released in late 1999 also played SACD. SACD players are now also made by Accuphase, Aiwa, Denon, Kenwood, Marantz, Philips, Sharp, and others.

Sunday, April 19, 2015

The History about Gadget




The origins of the word "gadget" trace back to the 19th century. According to the Oxford English Dictionary, there is anecdotal (not necessarily true) evidence for the use of "gadget" as a placeholder name for a technical item whose precise name one can't remember since the 1850s, with Robert Brown's 1886 book Spunyarn and Spindrift, A sailor boy's log of a voyage out and home in a China tea-clipper, containing the earliest known usage in print. The etymology of the word is disputed.

A widely circulated story holds that the word gadget was "invented" when Gaget, Gauthier & Cie, the company behind the repoussé construction of the Statue of Liberty (1886), made a small-scale version of the monument and named it after their firm; however, this contradicts evidence that the word was already in use in nautical circles, and the fact that it did not become popular, at least in the USA, until after World War I. Other sources cite a derivation from the French gâchette, which has been applied to various pieces of a firing mechanism, or the French gagée, a small tool or accessory.

The October 1918 issue of Notes and Queries contains a multi-article entry on the word "gadget" (12 S. iv. 187). H. Tapley-Soper of The City Library, Exeter, writes:

A discussion arose at the Plymouth meeting of the Devonshire Association in 1916 when it was suggested that this word should be recorded in the list of local verbal provincialisms. Several members dissented from its inclusion on the ground that it is in common use throughout the country; and a naval officer who was present said that it has for years been a popular expression in the service for a tool or implement, the exact name of which is unknown or has for the moment been forgotten. I have also frequently heard it applied by motor-cycle friends to the collection of fitments to be seen on motor cycles. 'His handle-bars are smothered in gadgets' refers to such things as speedometers, mirrors, levers, badges, mascots, &c., attached to the steering handles. The 'jigger' or short-rest used in billiards is also often called a 'gadget'; and the name has been applied by local platelayers to the 'gauge' used to test the accuracy of their work. In fact, to borrow from present-day Army slang, 'gadget' is applied to 'any old thing.'

The usage of the term in military parlance extended beyond the navy. In the book Above the Battle by Vivian Drake, published in 1918 by D. Appleton & Co. of New York and London, the memoirs of a pilot in the British Royal Flying Corps, there is the following passage: "Our ennui was occasionally relieved by new gadgets -- 'gadget' is the Flying Corps slang for invention! Some gadgets were good, some comic and some extraordinary."

By the second half of the twentieth century, the term "gadget" had taken on the connotations of compactness and mobility. In the 1965 essay "The Great Gizmo" (a term used interchangeably with "gadget" throughout the essay), the architectural and design critic Reyner Banham defines the item as:

A characteristic class of US products––perhaps the most characteristic––is a small self-contained unit of high performance in relation to its size and cost, whose function is to transform some undifferentiated set of circumstances to a condition nearer human desires. The minimum of skills is required in its installation and use, and it is independent of any physical or social infrastructure beyond that by which it may be ordered from catalogue and delivered to its prospective user. A class of servants to human needs, these clip-on devices, these portable gadgets, have coloured American thought and action far more deeply––I suspect––than is commonly understood.

Saturday, April 18, 2015

Motherboard


A motherboard (sometimes alternatively known as the mainboard, system board, planar board or logic board, or colloquially, a mobo) is the main printed circuit board (PCB) found in computers and other expandable systems. It holds many of the crucial electronic components of the system, such as the central processing unit (CPU) and memory, and provides connectors for other peripherals. Unlike a backplane, a motherboard contains significant sub-systems such as the processor and other components.

Motherboard specifically refers to a PCB with expansion capability; as the name suggests, this board is the "mother" of all components attached to it, which often include sound cards, video cards, network cards, hard drives and other forms of persistent storage, TV tuner cards, cards providing extra USB or FireWire slots, and a variety of other custom components. (The term mainboard is applied to devices with a single board and no additional expansions or capability, such as the controlling boards in televisions, washing machines, and other embedded systems.)

CPU sockets

A CPU (central processing unit) socket or slot is an electrical component that attaches to a printed circuit board (PCB) and is designed to house a CPU (also called a microprocessor). It is a special type of integrated circuit socket designed for very high pin counts. A CPU socket provides many functions, including a physical structure to support the CPU, support for a heat sink, facilitation of replacement (as well as reduced cost), and, most importantly, an electrical interface both with the CPU and the PCB. CPU sockets are found on the motherboard in most desktop and server computers (laptops typically use surface-mount CPUs), particularly those based on the Intel x86 architecture. A CPU socket type and motherboard chipset must support the CPU series and speed.

Integrated peripherals

Block diagram of a modern motherboard, which supports many on-board peripheral functions as well as several expansion slots
With the steadily declining costs and size of integrated circuits, it is now possible to include support for many peripherals on the motherboard. By combining many functions on one PCB, the physical size and total cost of the system may be reduced; highly integrated motherboards are thus especially popular in small form factor and budget computers. For example, such a board may integrate:

Disk controllers for a floppy disk drive, up to 2 PATA drives, and up to 6 SATA drives (including RAID 0/1 support)
An integrated graphics controller supporting 2D and 3D graphics, with VGA and TV output
An integrated sound card supporting 8-channel (7.1) audio and S/PDIF output
A Fast Ethernet network controller for 10/100 Mbit networking
A USB 2.0 controller supporting up to 12 USB ports
An IrDA controller for infrared data communication (e.g. with an IrDA-enabled cellular phone or printer)
Temperature, voltage, and fan-speed sensors that allow software to monitor the health of computer components
Peripheral card slots

A typical motherboard of 2012 has a varying number of connections depending on its standard.

A standard ATX motherboard will typically have two or three PCI-E 16x connections for graphics cards, one or two legacy PCI slots for various expansion cards, and one or two PCI-E 1x slots (which have superseded PCI). A standard EATX motherboard will have two to four PCI-E 16x connections for graphics cards, and a varying number of PCI and PCI-E 1x slots. It can sometimes also have a PCI-E 4x slot (this varies between brands and models).

Some motherboards have two or more PCI-E 16x slots, to allow more than two monitors without special hardware, or to use a special graphics technology called SLI (for Nvidia) or CrossFire (for ATI). These allow two to four graphics cards to be linked together, for better performance in intensive graphical computing tasks such as gaming and video editing.

Wednesday, April 15, 2015

Freezer





Freezer units are used in households and in industry and commerce. Food stored at or below −18 °C (0 °F) is safe indefinitely. Most household freezers maintain temperatures from −23 to −18 °C (−9 to 0 °F), although some freezer-only units can achieve −34 °C (−29 °F) and lower. Refrigerators generally do not achieve lower than −23 °C (−9 °F), since the same coolant loop serves both compartments: lowering the freezer compartment temperature excessively causes difficulties in maintaining an above-freezing temperature in the refrigerator compartment. Domestic freezers can be included as a separate compartment in a refrigerator, or can be a separate appliance. They are generally either upright units resembling refrigerators, or chests (in effect, upright units laid on their backs). Many modern upright freezers come with an ice dispenser built into the door. Some upscale models include thermostat displays and controls, and flat-screen televisions have even been incorporated.

Effect on lifestyle

The refrigerator allows the modern family to keep food fresh for longer than before. The most notable improvement is for meat and other highly perishable wares, which previously had to be cured or otherwise processed to gain anything resembling shelf life. (On the other hand, refrigerators and freezers can also be stocked with processed, quick-cook foods that are less healthy.) Refrigeration in transit makes it possible to enjoy foodstuffs from distant places.

Dairy products, meats, fish, poultry and vegetables can be kept refrigerated in the same space within the kitchen (although raw meat should be kept separate from other foodstuffs for reasons of hygiene).

Freezers allow people to buy food in bulk and eat it at leisure, and bulk purchases save money. Ice cream, a popular commodity of the 20th century, could previously only be obtained by traveling to where the product was made and eating it on the spot. Now it is a common food item. Ice on demand not only adds to the enjoyment of cold drinks, but is useful for first-aid, and for cold packs that can be kept frozen for picnics or in case of emergency.

Temperature zones and ratings

Commercial for electric refrigerators in Pittsburgh, Pennsylvania, 1926
Some refrigerators are now divided into four zones to store different types of food:

−18 °C (0 °F) (freezer)
0 °C (32 °F) (meat zone)
5 °C (41 °F) (cooling zone)
10 °C (50 °F) (crisper)

The capacity of a refrigerator is measured in either litres or cubic feet. Typically, the volume of a combined refrigerator-freezer is split into 100 litres (3.53 cubic feet) for the freezer and 140 litres (4.94 cubic feet) for the refrigerator, although these values are highly variable.
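
The cubic-feet equivalents above come from the standard conversion of 1 cubic foot ≈ 28.317 litres; a minimal sketch of the arithmetic:

```python
LITRES_PER_CUBIC_FOOT = 28.3168

def litres_to_cubic_feet(litres):
    return litres / LITRES_PER_CUBIC_FOOT

print(round(litres_to_cubic_feet(100), 2))  # -> 3.53 (typical freezer share)
print(round(litres_to_cubic_feet(140), 2))  # -> 4.94 (typical refrigerator share)
```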

Temperature settings for refrigerator and freezer compartments are often given arbitrary numbers by manufacturers (for example, 1 through 9, warmest to coldest), but generally 3 to 5 °C (37 to 41 °F) is ideal for the refrigerator compartment and −18 °C (0 °F) for the freezer. Some refrigerators must be within certain external temperature parameters to run properly; this can be an issue when placing a unit in an unfinished area, such as a garage. European freezers, and refrigerators with a freezer compartment, are graded with a four-star rating system.

[∗] : minimum temperature −6 °C (21 °F); maximum storage time for (pre-frozen) food is 1 week
[∗∗] : minimum temperature −12 °C (10 °F); maximum storage time for (pre-frozen) food is 1 month
[∗∗∗] : minimum temperature −18 °C (0 °F); maximum storage time for (pre-frozen) food is between 3 and 12 months depending on type (meat, vegetables, fish, etc.)
[∗∗∗∗] : minimum temperature −18 °C (0 °F); maximum storage time for pre-frozen or frozen-from-fresh food is between 3 and 12 months
Although both the three and four star ratings specify the same storage times and same minimum temperature of −18 °C (0 °F), only a four star freezer is intended for freezing fresh food, and may include a "fast freeze" function (runs the compressor continually, down to as low as −26 °C (−15 °F)) to facilitate this. Three (or fewer) stars are used for frozen food compartments that are only suitable for storing frozen food; introducing fresh food into such a compartment is likely to result in unacceptable temperature rises. This difference in categorisation is shown in the design of the 4-star logo, where the "standard" three stars are displayed in a box using "positive" colours, denoting the same normal operation as a 3-star freezer, and the fourth star showing the additional fresh food/fast freeze function is prefixed to the box in "negative" colours or with other distinct formatting.
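
The star ratings amount to a small lookup table from rating to minimum temperature and storage guidance, and can be captured directly in code; a minimal sketch encoding the figures listed above (the names are illustrative):

```python
# EU freezer star ratings: stars -> (minimum temperature in °C, storage guidance)
FREEZER_STAR_RATINGS = {
    1: (-6, "pre-frozen food, up to 1 week"),
    2: (-12, "pre-frozen food, up to 1 month"),
    3: (-18, "pre-frozen food, 3-12 months depending on type"),
    4: (-18, "pre-frozen or fresh-frozen food, 3-12 months"),
}

def describe_rating(stars):
    temperature, guidance = FREEZER_STAR_RATINGS[stars]
    return f"{stars}-star: min {temperature} °C, {guidance}"

# Only a 4-star freezer is intended for freezing fresh food:
print(describe_rating(4))
```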

Most European refrigerators include a moist cold refrigerator section (which does require (automatic) defrosting at irregular intervals) and a (rarely frost free) freezer section.

Macintosh




The Macintosh (branded as Mac since 1998) is a series of personal computers (PCs) designed, developed, and marketed by Apple Inc. Steve Jobs introduced the original Macintosh computer on January 24, 1984. It was the first mass-market personal computer featuring an integral graphical user interface and mouse. This first model was later renamed the "Macintosh 128K" to distinguish it within a growing family of subsequently updated models based on the same proprietary Apple architecture. Since 1998, Apple has largely phased out the Macintosh name in favor of "Mac", though the product family had been nicknamed "Mac" or "the Mac" since the development of the first model.



The Macintosh, however, was expensive, which hindered its ability to be competitive in a market already dominated by the Commodore 64 for consumers, as well as the IBM Personal Computer and its accompanying clone market for businesses. Macintosh systems still found success in education and desktop publishing and kept Apple as the second-largest PC manufacturer for the next decade. In the 1990s, improvements in the rival Wintel platform, notably with the introduction of Windows 3.0, gradually took market share from the more expensive Macintosh systems. The performance advantage of 68000-based Macintosh systems was eroded by Intel's Pentium, and in 1994 Apple was relegated to third place as Compaq became the top PC manufacturer. Even after a transition to the superior PowerPC-based Power Macintosh line in 1994, the falling prices of commodity PC components and the release of Windows 95 saw the Macintosh user base decline.



In 1998, after the return of Steve Jobs, Apple consolidated its multiple consumer-level desktop models into the all-in-one iMac G3, which became a commercial success and revitalized the brand. Since the transition to Intel processors in 2006, the complete lineup has been based entirely on those processors and associated systems. The current lineup comprises three desktops (the all-in-one iMac, the entry-level Mac mini, and the Mac Pro tower graphics workstation) and four laptops (the MacBook, MacBook Air, MacBook Pro, and MacBook Pro with Retina display). The Xserve server was discontinued in 2011 in favor of the Mac mini and Mac Pro.



Production of the Mac is based on a partial vertical integration model. While Apple designs its own hardware and creates its own operating system that is pre-installed on all Mac computers, Apple sources components, such as microprocessors, RAM and LCD panels from other vendors. Apple also relies on contract manufacturers like Foxconn and Flextronics to build most of its products. This approach differs from most IBM PC compatibles, where multiple sellers create and integrate hardware intended to run another company's operating system.



Apple also develops the operating system for the Mac, currently OS X version 10.10 "Yosemite". Macs are currently capable of running non-Apple operating systems such as Linux, OpenBSD, and Microsoft Windows with the aid of Boot Camp or third-party software. Apple does not license OS X for use on non-Apple computers, though it did license previous versions of Mac OS through their Macintosh clone program from 1995 to 1997.


The History of Android




Android, Inc. was founded in Palo Alto, California in October 2003 by Andy Rubin (co-founder of Danger), Rich Miner (co-founder of Wildfire Communications, Inc.), Nick Sears (once VP at T-Mobile), and Chris White (who headed design and interface development at WebTV) to develop, in Rubin's words, "smarter mobile devices that are more aware of its owner's location and preferences". The early intentions of the company were to develop an advanced operating system for digital cameras. However, when it was realized that the market for such devices was not large enough, the company diverted its efforts toward producing a smartphone operating system that would rival Symbian and Microsoft Windows Mobile. Despite the past accomplishments of the founders and early employees, Android Inc. operated secretly, revealing only that it was working on software for mobile phones. That same year, Rubin ran out of money. Steve Perlman, a close friend of Rubin, brought him $10,000 in cash in an envelope and refused a stake in the company.

Google acquired Android Inc. on August 17, 2005; key employees of Android Inc., including Rubin, Miner, and White, stayed at the company after the acquisition. Not much was known about Android Inc. at the time, but many assumed that Google was planning to enter the mobile phone market with this move. At Google, the team led by Rubin developed a mobile device platform powered by the Linux kernel. Google marketed the platform to handset makers and carriers on the promise of providing a flexible, upgradable system. Google had lined up a series of hardware component and software partners and signaled to carriers that it was open to various degrees of cooperation on their part.

Speculation about Google's intention to enter the mobile communications market continued to build through December 2006. An earlier prototype, codenamed "Sooner", had a closer resemblance to a BlackBerry phone, with no touchscreen and a physical QWERTY keyboard, but it was later re-engineered to support a touchscreen, to compete with other announced devices such as the 2006 LG Prada and the 2007 Apple iPhone. In September 2007, InformationWeek covered an Evalueserve study reporting that Google had filed several patent applications in the area of mobile telephony.


Eric Schmidt, Andy Rubin, and Hugo Barra at a press conference for Google's Nexus 7 tablet
On November 5, 2007, the Open Handset Alliance, a consortium of technology companies including Google, device manufacturers such as HTC, Sony and Samsung, wireless carriers such as Sprint Nextel and T-Mobile, and chipset makers such as Qualcomm and Texas Instruments, unveiled itself, with a goal to develop open standards for mobile devices. That day, Android was unveiled as its first product, a mobile device platform built on the Linux kernel version 2.6.25. The first commercially available smartphone running Android was the HTC Dream, released on October 22, 2008.

In 2010, Google launched its Nexus series of devices – a line of smartphones and tablets running the Android operating system, and built by manufacturing partners. HTC collaborated with Google to release the first Nexus smartphone, the Nexus One. Google has since updated the series with newer devices, such as the Nexus 5 phone (made by LG) and the Nexus 7 tablet (made by Asus). Google releases the Nexus phones and tablets to act as their flagship Android devices, demonstrating Android's latest software and hardware features. On March 13, 2013 Larry Page announced in a blog post that Andy Rubin had moved from the Android division to take on new projects at Google. He was replaced by Sundar Pichai, who also continues his role as the head of Google's Chrome division, which develops Chrome OS.

Since 2008, Android has seen numerous updates which have incrementally improved the operating system, adding new features and fixing bugs in previous releases. Each major release is named in alphabetical order after a dessert or sugary treat; for example, version 1.5 Cupcake was followed by 1.6 Donut. Version 4.4.4 KitKat appeared as a security-only update; it was released on June 19, 2014, shortly after 4.4.3 was released. As of November 2014, the newest version of the Android operating system, Android 5.0 "Lollipop", is available for selected devices.

From 2010 to 2013, Hugo Barra served as product spokesperson for the Android team, representing Android at both press conferences and Google I/O, Google's annual developer-focused conference. Barra's product involvement spanned the entire Android ecosystem of software and hardware, including the Honeycomb, Ice Cream Sandwich, Jelly Bean, and KitKat operating system launches, the Nexus 4 and Nexus 5 smartphones, the Nexus 7 and Nexus 10 tablets, and other related products such as Google Now and Google Voice Search, Google's speech-recognition product comparable to Apple's Siri. In 2013, Barra left the Android team to join Chinese smartphone maker Xiaomi.

Wednesday, April 8, 2015

Computer Monitor


A monitor or a display is an electronic visual display for computers. The monitor comprises the display device, circuitry and an enclosure. The display device in modern monitors is typically a thin film transistor liquid crystal display (TFT-LCD) thin panel, while older monitors used a cathode ray tube (CRT) about as deep as the screen size.

Originally, computer monitors were used for data processing while television receivers were used for entertainment. From the 1980s onwards, computers (and their monitors) have been used for both data processing and entertainment, while televisions have implemented some computer functionality. The common aspect ratio of televisions, and then computer monitors, has also changed from 4:3 to 16:9.
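
The shift from 4:3 to 16:9 changes a screen's shape even at the same diagonal size, since width and height follow from the diagonal and the aspect ratio by the Pythagorean theorem. A minimal sketch comparing the two ratios at an assumed 24-inch diagonal:

```python
import math

def screen_dimensions(diagonal_inches, aspect_w, aspect_h):
    # Scale the aspect-ratio triangle so its hypotenuse equals the diagonal.
    scale = diagonal_inches / math.hypot(aspect_w, aspect_h)
    return aspect_w * scale, aspect_h * scale

for w, h in [(4, 3), (16, 9)]:
    width, height = screen_dimensions(24, w, h)
    print(f'{w}:{h} -> {width:.1f}" x {height:.1f}" ({width * height:.0f} sq in)')
```

At an equal diagonal, the 16:9 panel is wider but has roughly 11% less area, which is worth keeping in mind when comparing monitors across the two eras.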

History

Early electronic computers were fitted with a panel of light bulbs where the state of each particular bulb would indicate the on/off state of a particular register bit inside the computer. This allowed the engineers operating the computer to monitor the internal state of the machine, so this panel of lights came to be known as the 'monitor'. As early monitors were only capable of displaying a very limited amount of information, and were very transient, they were rarely considered for programme output. Instead, a line printer was the primary output device, while the monitor was limited to keeping track of the programme's operation.

As technology developed it was realized that the output of a CRT display was more flexible than a panel of light bulbs and eventually, by giving control of what was displayed to the programme itself, the monitor itself became a powerful output device in its own right.

Tuesday, April 7, 2015

History of iPhone




Development of what was to become the iPhone began in 2004, when Apple started to gather a team of 1,000 employees to work on the highly confidential "Project Purple", including Jonathan Ive, the designer behind the iMac and iPod. Apple CEO Steve Jobs steered the original focus away from a tablet, like the iPad, and towards a phone. Apple created the device during a secretive collaboration with AT&T Mobility—Cingular Wireless at the time—at an estimated development cost of US$150 million over thirty months.

Apple rejected the "design by committee" approach that had yielded the Motorola ROKR E1, a largely unsuccessful collaboration with Motorola. Instead, Cingular gave Apple the liberty to develop the iPhone's hardware and software in-house and even paid Apple a fraction of its monthly service revenue (until the iPhone 3G) in exchange for four years of exclusive US sales, until 2011.

Jobs unveiled the iPhone to the public on January 9, 2007, at the Macworld 2007 convention at the Moscone Center in San Francisco. The two initial models, a 4 GB model priced at US$499 and an 8 GB model at US$599 (both requiring a 2-year contract), went on sale in the United States on June 29, 2007, at 6:00 pm local time, while hundreds of customers lined up outside stores nationwide. The passionate reaction to the launch of the iPhone resulted in sections of the media dubbing it the "Jesus phone". Following this successful release in the US, the first-generation iPhone was made available in the UK, France, and Germany in November 2007, and in Ireland and Austria in the spring of 2008.


Worldwide iPhone availability:

  •   iPhone available since its original release
  •   iPhone available since the release of iPhone 3G
  •   Coming soon

On July 11, 2008, Apple released the iPhone 3G in twenty-two countries, including the original six, and eventually in upwards of eighty countries and territories. Apple announced the iPhone 3GS on June 8, 2009, along with plans to release it later in June, July, and August, starting with the US, Canada, and major European countries on June 19. Many would-be users objected to the iPhone's cost, and 40% of users have household incomes over US$100,000.

The back of the original first generation iPhone was made of aluminum with a black plastic accent. The iPhone 3G and 3GS feature a full plastic back to increase the strength of the GSM signal. The iPhone 3G was available in an 8 GB black model, or a black or white option for the 16 GB model. The iPhone 3GS was available in both colors, regardless of storage capacity.

The iPhone 4 has an aluminosilicate glass front and back with a stainless steel edge that serves as the device's antennas. It was at first available in black; a white version was announced but not released until April 2011, 10 months later.

The iPhone has gained positive reviews from such critics as David Pogue and Walt Mossberg. The iPhone attracts users of all ages, and besides consumer use, the iPhone has also been adopted for business purposes.

Users of the iPhone 4 reported dropped/disconnected telephone calls when holding their phones in a certain way. This became known as antennagate.

On January 11, 2011, Verizon announced during a media event that it had reached an agreement with Apple and would begin selling a CDMA iPhone 4. Verizon said it would be available for pre-order on February 3, with a release set for February 10. In February 2011, the Verizon iPhone accounted for 4.5% of all iPhone ad impressions in the US on Millennial Media's mobile ad network.

From 2007 to 2011, Apple spent $647 million on advertising for the iPhone in the US.

On Tuesday, September 27, 2011, Apple sent invitations for a press event to be held October 4, 2011, at 10:00 am at its Cupertino headquarters, to announce details of the next-generation iPhone, which turned out to be the iPhone 4S. Over 1 million 4S models were sold in the first 24 hours after its release in October 2011. Due to the large volumes of the iPhone being manufactured and its high selling price, Apple became the largest mobile handset vendor in the world by revenue in 2011, surpassing long-time leader Nokia. American carrier C Spire Wireless announced that it would be carrying the iPhone 4S on October 19, 2011.

In January 2012, Apple reported its best quarterly earnings ever, with 53% of its revenue coming from the sale of 37 million iPhones, at an average selling price of nearly $660. The average selling price has remained fairly constant for most of the phone's lifespan, hovering between $622 and $660. The production price of the iPhone 4S was estimated by IHS iSuppli, in October 2011, to be $188, $207 and $245, for the 16 GB, 32 GB and 64 GB models, respectively. Labor costs are estimated at between $12.50 and $30 per unit, with workers on the iPhone assembly line making $1.78 an hour.

In February 2012, ComScore reported that 12.4% of US mobile subscribers used an iPhone. Approximately 6.4 million iPhones are active in the US alone.

On September 12, 2012, Apple announced the iPhone 5. It has a 4-inch display, up from its predecessors' 3.5-inch screens, with the same 326 pixels per inch found in the iPhone 4 and 4S. The iPhone 5 has the A6 system-on-chip, which is 22% smaller than the iPhone 4S's A5 and twice as fast, doubling the graphics performance of its predecessor. The device is 18% thinner than the iPhone 4S, measuring 7.6 millimetres (0.3 in), and is 20% lighter at 112 grams (4 oz).

On July 6, 2013, it was reported that Apple was in talks with Korean mobile carrier SK Telecom to release the next generation iPhone with LTE Advanced technology.

On July 22, 2013 the company's suppliers said that Apple is testing out larger screens for the iPhone and iPad. "Apple has asked for prototype smartphone screens larger than 4 inches and has also asked for screen designs for a new tablet device measuring slightly less than 13 inches diagonally, they said."

On September 10, 2013, Apple unveiled two new iPhone models during a highly anticipated press event in Cupertino. The iPhone 5C, a mid-range-priced version of the handset designed to increase accessibility through its lower price, is available in five colors (green, blue, yellow, pink, and white) and is made of plastic. The iPhone 5S comes in three colors (black, white, and gold), and its home button is replaced with a fingerprint scanner (Touch ID). Both phones shipped on September 20, 2013.

On September 9, 2014, Apple revealed the iPhone 6 and the iPhone 6 Plus at an event in Cupertino. Both devices had a larger screen than their predecessor, at 4.7 and 5.5 inches respectively.

In January 2015, Apple held second place in the US smartphone market with about a 31% share. Competing devices running the Android operating system have approximately 62% of the US market, 82.7% of the Chinese market, and 73.3% of the European market (countries such as the UK, France, Germany, Spain, and Italy).