Thursday, December 18, 2008

SSDs From Samsung To Boost HP Servers



Samsung has announced two new high-performance SSDs in Korea, with capacities of 32 GB and 64 GB, designed especially for server use. They achieve read speeds of 100 MB/s and write speeds of 80 MB/s. Both models have been available since the end of September, and are initially planned for use in HP servers such as the “ProLiant BL495c G5 Virtualization Blade Server.”

Toshiba Sets A Record With 512GB SSD



Yesterday Toshiba announced a record-setting 2.5-inch, 512GB solid state drive. It uses the company’s newest 43-nanometer manufacturing process and particularly dense multi-level cell storage to double the 256GB record set only earlier this year. The new device also brings faster performance, with a peak read speed of 240MB per second and a write speed of 200MB per second.

Friday, November 21, 2008

Fluorescent Multilayer Disc


The first generation of disc products from Constellation 3D will be a family of 120 mm multilayer FM discs with capacities of up to 140 GBytes and read speeds of up to 1 GByte/s. It is interesting to compare them with DVDs, which hold up to 17.4 GBytes. The new discs will be able to store, for example, up to 20 hours of compressed movies in HDTV format.

Compact discs were a revolutionary product in their time and influenced many spheres of human activity. People could record high-quality music that did not degrade over time, as happens with tape or vinyl. As soon as CDs appeared in the computer industry they became an indispensable tool for users and programmers alike; the latter could enrich their software with video and audio elements. Later, discs were used for digital video (VideoCD).
 
But technology progresses, and data volumes are growing faster and faster. An ordinary CD (640 MBytes) is far from enough, so DVD technology appeared. Of course we are happy with the 17 GBytes that a DVD can hold, but that is its limit.



FMD-ROM: operating principle

In optical discs such as CD, DVD and MO, reading works as follows: a beam from a semiconductor laser hits the informational layer, reflects off an aluminum (or other metallic) layer, and is registered by a detector. In FMD there is no reflected laser beam: when the laser beam reaches an informational layer, that layer itself starts radiating.

The principle of operation of FM discs is based on the phenomenon of photochromism. Some years ago Russian chemists discovered a stable organic material, a "stable photochrome", which acquires fluorescent properties when acted upon by a laser beam.

An informational element of an FM disc (the photochrome) changes its physical properties (such as color and the presence of fluorescence) under the influence of a laser of a particular power and wavelength. Initially the photochrome possesses no fluorescent properties; when the write laser is switched on, a photochemical reaction causes fluorescent properties to appear. During reading, the material is excited again, but by a lower-power laser. The fluorescence is caught by a photo-receiver and registered as a "1".

In conventional optical discs (CD/DVD), the signal degrades as the number of informational layers increases. These technologies rely on a reflected signal, which requires mirror surfaces; that is why in DVD technology the external layer is made semitransparent, so the laser can reach the internal one.

A signal passing through the external layer loses part of its energy to reflection, and because the signals reflected from the two layers are coherent, they interfere, resulting in a loss of useful signal. Increasing the number of layers aggravates this multiple-interference effect and makes reading more complicated. The problem could be solved by improving the detector-receivers, but so far only in the laboratory. In fluorescent discs, signal quality degrades much more slowly as layers are added: according to FMD-ROM developers, a useful signal remains acceptable even with a hundred layers.

Here is a list of advantages of FMD/C:

  1. The multilayer disc is transparent and homogeneous
  2. Small loss of useful signal while passing through several layers
  3. Fluorescence from an individual element passes easily through the disc layers
  4. Less sensitivity than CD/DVD to imperfections in reading devices; fluorescent technology does not require special manufacturing conditions
  5. Fluorescent light from any layer is not coherent, which avoids the problem of multiple interference
  6. FMD technology is compatible with CD and DVD formats, supporting the same data distribution system on each layer

Parameter                                   | CD   | DVD              | FMD
Disc diameter, mm                           | 120  | 120              | 130
Capacity, GBytes                            | 0.64 | 17.4             | 50.8
Number of layers                            | 1    | 2 (on each side) | 12
Distance between layers, micron             | -    | 40               | 25±5
Total width of informational layers, micron | 0.11 | 2                | 275
Format                                      | CD   | DVD              | Modified DVD
Distance between tracks, micron             | 1.6  | 0.74             | 0.8
Laser wavelength, nm                        | 780  | 635-650          | 532

 

Friday, November 7, 2008

Toshiba Qosmio: More Than a Laptop



An extraordinary family of laptops with a―well, far-from-ordinary name. Pronounced “KOZ-me-o,” these amazing machines are designed to be an exciting epicenter for all the entertainment you love, from non-stop gaming action to feature films, home movies, photos, and your favorite songs. Brace yourself. When it comes to fun our new Qosmio notebooks are way off the Richter scale.

When it comes to multimedia our Qosmio brand rules. That includes our 18.4” G55 with an amazing Toshiba Quad Core HD Processor for HD video processing, our 17” diagonal screen gaming dynamo, the X305, with its premier graphics system, all the way to our mobile 15.4” diagonal screen F55 laptop with built-in Garmin® GPS. Whichever you choose, thanks to stunning TruBrite® display technology and other high-end features, you’re in for some serious thrills. Hail, Qosmio!

Wherever you go, bring nothing less than your A-game. Our new Qosmio X305 notebooks now offer powerful NVIDIA® graphics systems with up to an amazing 1GB of discrete graphics memory to put your skill―and your very nerve―to the ultimate test. You heard us right: up to one full 1GB!
Download your favorite music, build special playlists, then sit back and get ready. Up to four bass reflex stereo speakers and a sub-woofer on these Qosmio laptops will envelop you in a full, rich blanket of sound, delivering brighter highs and more luxurious lows. Where’s the sweet spot? Right where you are.
Super-size your entertainment most anywhere. It’s easy with new Toshiba REGZA-Link® (HDMI-CEC) technology on the Qosmio series. Now you can blast spectacular high-def pictures and magnificent 5.1 virtual surround sound through a home theater system, and run all the components using a single remote control. Now how cool is that? And how totally convenient too!

Toshiba Satellite E105's Dull 'Silver' Finish

Toshiba's new 14.1-inch laptop, the Satellite E105, has earned the company more green laurels and a Silver classification from the Green Electronics Council's EPEAT program. That sounds great, but when you look closely, Silver has a dull finish.

Last year, EPEAT (the Electronic Product Environmental Assessment Tool) gave the Toshiba Portégé R500 a Gold ranking. So, what stopped the Satellite E105 from coming home with the Gold? Lead. And mercury. Toshiba, like many computer manufacturers, hasn't yet eliminated the intentional addition of the two toxins in manufacturing its laptops. Toshiba also failed to use any renewable or bio-based plastics material (though EPEAT did give it credit for declaring this deficit).

Also to its credit -- both from EPEAT and environmentalists at large -- Toshiba provides a take-back service for its laptops and its rechargeable battery pack. It's called the Toshiba Free Electronic Program and lets you recycle your old electronics regardless of the brand and at no cost to you. Toshiba also provides the shipping label. (Dell, to its credit, had been on the vanguard of such consumer- and eco-friendly measures).

Toshiba deservedly boasts that its products comply with two world-recognized directives: WEEE (Waste Electrical and Electronic Equipment), which protects the environment by promoting the reuse, recycling, and recovery of electronic materials; and RoHS (Restriction of Hazardous Substances), which seeks to reduce the use of manufacturing substances that harm human health or the environment. "The result? We've kept tons of toxic chemicals and other substances out of the ecosystem," Toshiba notes in its annual corporate social responsibility report. Those efforts include recovering and recycling tin used in manufacturing the Portégé R500 series at its Toshiba Information Equipment (Hangzhou) plant in China.

But given the tons of Toshiba-labeled plastics that are never recovered and end up in trash heaps around the globe, I encourage Toshiba to again go for the Gold.

AMD Toasts Dell’s Quad-Core Opteron Servers

Late to the party but celebrating nonetheless, AMD is announcing that Dell is offering five quad-core Opteron server platforms, reports The Register. It’s not clear why AMD and not Dell is throwing the confetti, though.

        

The line includes the PowerEdge SC1435, 2970, M605 blade server and 6950 platforms, as well as the PowerEdge T605 tower server. ZDNet blogger Larry Dignan reports VMware also has completed certification of the quad-core AMD Opterons.

As eWEEK notes, AMD needs some good news, after recently announcing more layoffs amid dismal financial performance.

Meanwhile, rival Intel reported strong sales in the quarter, and share prices jumped with its rosy forecast for the remainder of 2008, reports The San Jose Mercury News. Investors have been watching the company as a bellwether of the tech industry’s health in the slumping U.S. economy.

Apple Becomes No. 2 Smartphone

Apple’s popular iPhone has elbowed its way past RIM’s BlackBerry to claim the No. 2 spot in the global smartphone market in the third quarter, reports vnunet.com.

According to figures from research firm Canalys, overall smartphone shipments increased 28 percent over the same period a year ago.

Apple shipped nearly 7 million devices, taking 17.3 percent of the market. Still, shipments by No. 3 RIM grew more than 80 percent, giving it 15.2 percent market share. Nokia retains the global crown, though its shipments decreased slightly to 15 million units and a 38.4 percent market share.

Meanwhile, in the overall U.S. mobile phone market, Samsung toppled Motorola for the top spot with phones that do much of what the iPhone does, according to figures from research firm Strategy Analytics.

Bloomberg reports South Korea’s Samsung nabbed 22.4 percent of the U.S. market, up from 17.9 percent a year ago. Motorola, No. 4 in the global market, lost share from 32.7 percent a year ago to 21.1 percent. Analysts say it needs to add more smartphones to its lineup.

Thursday, November 6, 2008

3G Spectrum Licenses Planned for Allocation in India


After a long wait, the Indian government has decided that it will auction 3G spectrum, and will allow foreign players to participate in the bidding. The spectrum is going to be released in the 2100 MHz band, which would make it compatible with the rest of the world, barring a few countries such as the U.S.

Here are some specifics:

  • Winners will be required to pay 0.5 percent of total adjusted gross revenue to the state on an annual basis for the first three years of operation, rising to 1 percent thereafter.
  • Winners will not be allowed to trade or resell the spectrum and they will not be allowed to merge in the first five years after the grant.
  • About 30MHz of spectrum will be sold in total.
  • India will also auction licenses for broadband wireless (WiMAX) services, with the reserve price fixed at 25 percent of the amount for 3G spectrum.

The spectrum is going to be plentiful; our sources say that it will be enough to accommodate six carriers. The carriers we expect will make it to the finish line include the current leader, Bharti Airtel; Idea Telecom; Reliance and Vodafone (VOD). The two international cell phone companies likely to win the spectrum bid include AT&T (T) and Sistema.

If the rollout of voice services over the past decade is any indication, this is a big opportunity for equipment makers. Ericsson (ERICY) and Nokia (NOK) have done well in India, and there is little reason why the situation would change, though one suspects the Chinese equipment vendors are going to act as deflationary counterweights to their Western counterparts.

The 3G services in India will eventually have to compete with WiMAX, which is being seen as the wireless broadband technology of choice by lawmakers. More importantly, this auction is a way for the Indian government to keep the telecom sector specifically and the economy overall growing at a healthy clip. India’s economy was catalyzed by telecom and call center businesses, then spread to other sectors.

Wednesday, September 3, 2008

Blu-ray Disc




Blu-ray Disc is an optical storage disc format. Its main uses are high-definition video and data storage. The disc has the same dimensions as a standard DVD or CD. The name Blu-ray Disc is derived from the blue-violet laser used to read and write this type of disc. Because of its shorter wavelength (405 nm, versus 650 nm for DVD), substantially more data can be stored on a Blu-ray Disc than on a DVD. A dual-layer Blu-ray Disc can store 50 GB, almost six times the capacity of a dual-layer DVD.
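As a rough sanity check on the capacity claim, optical disc areal density scales roughly with (NA/λ)², where NA is the lens's numerical aperture and λ the laser wavelength. The sketch below uses the commonly cited optics figures (DVD: 650 nm at NA 0.60; Blu-ray: 405 nm at NA 0.85), which are assumptions about typical drives rather than numbers from this article:

```python
# Rough sketch: per-layer optical storage density scales ~ (NA / wavelength)^2.
dvd_na, dvd_wl = 0.60, 650e-9   # DVD: numerical aperture, laser wavelength (m)
bd_na, bd_wl = 0.85, 405e-9     # Blu-ray

density_ratio = (bd_na / bd_wl) ** 2 / (dvd_na / dvd_wl) ** 2
print(f"Blu-ray packs roughly {density_ratio:.1f}x the bits per layer of DVD")

# Compare with actual per-layer capacities: 25 GB vs 4.7 GB.
print(f"Actual per-layer capacity ratio: {25 / 4.7:.1f}x")
```

The simple scaling law lands within a few percent of the real 25 GB vs 4.7 GB per-layer ratio, which is why the shorter wavelength matters so much.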






During the high-definition optical disc format war, Blu-ray Disc competed with the HD DVD format. On February 19, 2008, Toshiba — the main company supporting HD DVD — announced it would no longer develop, manufacture, or market HD DVD players and recorders. Blu-ray Disc was developed by the Blu-ray Disc Association, a group representing consumer electronics, computer hardware, and motion picture production companies. As of August 27, 2008, more than 780 Blu-ray Disc titles had been released in the United States and more than 480 in Japan. Blu-ray thus looks set to become the dominant format in the coming years, displacing CDs and DVDs.
Because the Blu-ray Disc data layer is closer to the surface of the disc than in the DVD standard, it was at first more vulnerable to scratches, and the first discs were housed in cartridges for protection. TDK was the first company to develop a working scratch-protection coating for Blu-ray Discs, named Durabis. In addition, both Sony's and Panasonic's replication methods include proprietary hard-coat technologies: Sony's rewritable media are spin-coated with a scratch-resistant, antistatic coating, and Verbatim's recordable and rewritable Blu-ray Discs use a proprietary hard-coat technology called ScratchGuard.

Broadband

Broadband in telecommunications refers to a signaling method that handles a relatively wide range of frequencies, which may be divided into channels or frequency bins. The wider the bandwidth, the greater the information-carrying capacity. Although various minimum speeds have been used in definitions of broadband, ranging from 64 kbit/s up to 1.0 Mbit/s, the 2006 OECD report is typical in counting only download speeds equal to or faster than 256 kbit/s as broadband, and the US FCC currently defines broadband as anything above 768 kbit/s.


Broadband transmission rates
Connection   | Transmission Speed
DS-1 (T1)    | 1.54 Mbit/s
E-1          | 2.04 Mbit/s
DS-3 (T3)    | 44.73 Mbit/s
OC-3         | 155.52 Mbit/s
OC-12        | 622.08 Mbit/s
OC-48        | 2.48 Gbit/s
OC-192       | 9.95 Gbit/s
OC-768       | 39.81 Gbit/s
OC-1536      | 79.60 Gbit/s
OC-3072      | 159.20 Gbit/s
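To make these rates concrete, here is a small sketch converting a link speed into the ideal transfer time for a file. It ignores protocol overhead and contention, and the 1 GB file size is just an illustrative choice:

```python
def transfer_seconds(size_bytes: float, rate_bits_per_s: float) -> float:
    """Ideal time to move size_bytes over a link, ignoring all overhead."""
    return size_bytes * 8 / rate_bits_per_s

gigabyte = 1e9  # decimal GB, for simplicity
for name, rate in [("DS-1", 1.54e6), ("DS-3", 44.73e6), ("OC-3", 155.52e6)]:
    print(f"1 GB over {name}: {transfer_seconds(gigabyte, rate):.0f} s")
```

The same file that takes well over an hour on a DS-1 moves in under a minute on an OC-3, which is the practical meaning of the capacity column above.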





Satellite Internet

This employs a satellite in geostationary orbit to relay data from the satellite company to each customer. Satellite Internet is usually among the most expensive ways of gaining broadband Internet access, but in rural areas it may compete only with cellular broadband. Costs, however, have been coming down in recent years to the point that it is becoming more competitive with other high-speed options. Satellite Internet also suffers from high latency, caused by the signal having to travel 35,000 km (22,000 miles) out to the satellite and back to Earth again. The round-trip delay can be as much as 500 ms to 900 ms, which makes the service unsuitable for applications requiring real-time user input, such as multiplayer Internet games and first-person shooters. Many games can still be played, but the practical scope is limited to real-time strategy or turn-based games.
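The quoted delay is easy to sanity-check from geometry alone. This sketch assumes the standard geostationary altitude of about 35,786 km and counts only light travel time (no processing or queuing delays): a request goes up to the satellite and down to the ground station, and the reply makes the same trip, i.e. four legs in total.

```python
C = 299_792_458.0    # speed of light in vacuum, m/s
ALTITUDE = 35_786e3  # geostationary orbit altitude above the equator, m

one_leg = ALTITUDE / C    # ground <-> satellite, one direction
round_trip = 4 * one_leg  # up + down for the request, up + down for the reply
print(f"Minimum physical round-trip latency: {round_trip * 1000:.0f} ms")
```

This comes out just under 500 ms, so the 500-900 ms figure quoted above is essentially the physical floor plus real-world processing overhead; no protocol tweak can get below it.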






Advantages
  • True global broadband Internet access availability
  • Mobile connection to the Internet
Disadvantages
  • High latency compared to other broadband services, especially 2-way satellite service
  • Unreliable: drop-outs are common during travel, inclement weather, and during sunspot activity
  • The narrow-beam highly directional antenna must be accurately pointed to the satellite orbiting overhead
  • The Fair Access Policy limits heavy usage, if applied by the service provider
  • One-way satellite service requires the use of a modem or other data uplink connection
  • Satellite dishes are large: although most use plastic to reduce weight, they are typically between 80 and 120 cm (30 to 48 inches) in diameter.

Cellular broadband

Cellular phone towers are very widespread, and as cellular networks move to third-generation (3G) technology they can support fast data using standards such as EV-DO, HSDPA and UMTS.

These can provide broadband Internet access via a cell phone, via CardBus, ExpressCard, or USB cellular modems, or via cellular broadband routers, which allow more than one computer to be connected to the Internet over a single cellular connection.




Advantages

  • The only broadband connection available on many cell phones

  • Mobile wireless connection to the Internet
  • Available in all metropolitan areas, most large cities, and along major highways
  • No need to aim an antenna in most cases
  • The antenna is extremely small compared to a satellite dish
  • Low latency compared to satellite internet
  • Higher availability than WiFi
  • A traveler who already has cellular broadband will not need to pay different WiFi Hot Spot providers for access.
Disadvantages
  • Unreliable: drop-outs are common during travel and during inclement weather
  • Not a truly nationwide service
  • Speed can vary widely throughout the day, sometimes falling well below the 400 kbit/s target during peak times.
  • High latency compared to DSL and Cable broadband services.
  • Often more costly compared to other methods

4G


After 3G comes the next generation: 4G, or fourth generation, a term used to describe the next complete evolution in wireless communications. A 4G system will provide a comprehensive IP solution in which voice, data and streamed multimedia are delivered to users on an "anytime, anywhere" basis, at higher data rates than previous generations. 4G is being developed to meet the quality-of-service and data-rate requirements set by forthcoming applications such as wireless broadband access, Multimedia Messaging Service (MMS), video chat, mobile TV, HDTV content, Digital Video Broadcasting (DVB), basic services like voice and data, and other streaming services.



Objectives of the 4G wireless communication standard
  • A Spectrally efficient system (in bits/s/Hz and bits/s/Hz/site).
  • High network capacity: more simultaneous users per cell.
  • A nominal data rate of 100 Mbit/s while the client physically moves at high speeds relative to the station, and 1 Gbit/s while client and station are in relatively fixed positions as defined by the ITU-R.
  • A data rate of at least 100 Mbit/s between any two points in the world.
  • Smooth handoff across heterogeneous networks.
  • Seamless connectivity and global roaming across multiple networks.
  • High quality of service for next generation multimedia support (real time audio, high speed data, HDTV video content, mobile TV, etc).
  • Interoperability with existing wireless standards.
  • An all IP, packet switched network.

3G


3G is the third generation of mobile phone standards and technology. 3G technologies enable network operators to offer users a wider range of more advanced services while achieving greater network capacity through improved spectral efficiency. Services include wide-area wireless voice telephony, video calls, and broadband wireless data, all in a mobile environment. 3G also includes HSPA data transmission capabilities able to deliver speeds of up to 14.4 Mbit/s on the downlink and 5.8 Mbit/s on the uplink. Most service providers now offer 3G, and most phones support it.



The first pre-commercial 3G network was launched by NTT DoCoMo in Japan, branded FOMA, in May 2001 on a pre-release of W-CDMA technology. The first commercial launch of 3G was also by NTT DoCoMo in Japan, on October 1, 2001.

The ITU has not provided a clear definition of the speeds users can expect from 3G equipment or providers, beyond a minimum of 2 Mbit/s for stationary or walking users and 384 kbit/s in a moving vehicle. Industry sources often suggest that 3G can be expected to provide 384 kbit/s at or below pedestrian speeds, but only 128 kbit/s in a moving car. EDGE is part of the 3G standard.

3G networks offer a greater degree of security than their 2G predecessors. By allowing the UE (user equipment) to authenticate the network it is attaching to, the user can be sure the network is the intended one and not an impersonator. 3G networks use the KASUMI block cipher instead of the older A5/1 stream cipher.

Bluetooth is a wireless protocol using short-range communications technology to transmit data over short distances between fixed and/or mobile devices. Bluetooth uses a radio technology called frequency-hopping spread spectrum: it chops up the data being sent and transmits chunks of it on up to 79 different frequencies. In its basic mode the modulation is Gaussian frequency-shift keying (GFSK), achieving a gross data rate of 1 Mbit/s. Bluetooth provides a way to connect and exchange information between devices such as mobile phones, telephones, laptops, personal computers, printers, and digital cameras over a secure, globally unlicensed Industrial, Scientific, and Medical (ISM) 2.4 GHz short-range radio frequency band.
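The frequency-hopping idea can be sketched in a few lines. This is an illustration of the concept only, not Bluetooth's actual hop-selection algorithm: the shared PRNG seed below stands in for the hop sequence that real devices derive from the master's address and clock.

```python
import random

# Bluetooth basic-rate channel plan: 79 channels of 1 MHz, starting at 2402 MHz.
CHANNELS_MHZ = [2402 + k for k in range(79)]

def hop_sequence(seed, n_hops):
    """Toy hop sequence: both ends seed the same PRNG, so they hop together."""
    rng = random.Random(seed)
    return [rng.choice(CHANNELS_MHZ) for _ in range(n_hops)]

# Two paired devices derive the same sequence, so they stay in sync
# while a narrowband interferer only corrupts the occasional hop.
master = hop_sequence(seed=0xC0FFEE, n_hops=5)
follower = hop_sequence(seed=0xC0FFEE, n_hops=5)
print(master)
print(master == follower)
```

Because any single interfering frequency only collides with a fraction of the hops, the link degrades gracefully instead of failing outright, which is the point of spread spectrum.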
















Applications of Bluetooth

  • Wireless control of, and communication between, a mobile phone and a hands-free headset
  • Wireless communication with PC input and output devices, the most common being the mouse, keyboard and printer

Global Positioning System


A GPS receiver calculates its position by carefully timing the signals sent by the constellation of GPS satellites high above the Earth. Each satellite continually transmits messages containing the time the message was sent, a precise orbit for the satellite sending the message, and the general system health and rough orbits of all GPS satellites. These signals travel at the speed of light through space, and slightly slower through the atmosphere. The receiver uses the arrival time of each message to measure the distance to each satellite, from which it determines its own position. The resulting coordinates are converted to more user-friendly forms such as latitude and longitude, or a location on a map, and displayed to the user. It might seem that three satellites would be enough to solve for a position, since space has three dimensions. However, a three-satellite solution requires the time to be known to a nanosecond or so, far better than any non-laboratory clock can provide. Using four or more satellites allows the receiver to solve for time as well as geographical position, eliminating the need for a very accurate clock. In other words, the receiver uses four measurements to solve for four variables: x, y, z, and t. While most GPS applications use the computed location and not the (very accurate) computed time, the time is used in some GPS applications such as time transfer and traffic-signal timing.
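The four-measurements-for-four-unknowns idea can be shown numerically. The sketch below is a simplified model, not real receiver firmware: satellite positions, the receiver location, and the 1 µs clock error are all hypothetical, and it uses plain Gauss-Newton iteration on exact, noise-free pseudoranges.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def gauss_solve(a, b):
    """Solve a 4x4 linear system a·x = b by Gaussian elimination with pivoting."""
    n = len(b)
    m = [row[:] + [rhs] for row, rhs in zip(a, b)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[pivot] = m[pivot], m[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= f * m[col][c]
    x = [0.0] * n
    for r in reversed(range(n)):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def solve_gps(sats, pseudoranges, iters=20):
    """Gauss-Newton: solve for receiver (x, y, z) and clock bias b (in metres)."""
    x = y = z = b = 0.0
    for _ in range(iters):
        jac, res = [], []
        for (sx, sy, sz), rho in zip(sats, pseudoranges):
            d = math.sqrt((x - sx) ** 2 + (y - sy) ** 2 + (z - sz) ** 2)
            jac.append([(x - sx) / d, (y - sy) / d, (z - sz) / d, 1.0])
            res.append(rho - (d + b))  # measured minus predicted pseudorange
        dx, dy, dz, db = gauss_solve(jac, res)
        x, y, z, b = x + dx, y + dy, z + dz, b + db
    return x, y, z, b

# Hypothetical scenario: receiver on the Earth's surface, 1 µs clock error.
sats = [(26_560e3, 0, 0), (0, 26_560e3, 0), (0, 0, 26_560e3),
        (15_000e3, 15_000e3, 15_000e3)]
true_pos, clock_bias = (6_371e3, 0.0, 0.0), 1e-6 * C  # bias expressed in metres
ranges = [math.dist(true_pos, s) + clock_bias for s in sats]
x, y, z, b = solve_gps(sats, ranges)
print(f"recovered position: ({x:.1f}, {y:.1f}, {z:.1f}), clock bias {b:.1f} m")
```

Note how the same measurements recover both the position and the ~300 m range error introduced by a mere microsecond of clock offset, which is exactly why the fourth satellite is needed.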





















How GPS works....

Applications

The Global Positioning System, while originally a military project, is considered a dual-use technology, meaning it has significant applications for both the military and civilian industry.

Military
  • Navigation: GPS allows soldiers to find objectives in the dark or in unfamiliar territory, and to coordinate the movement of troops and supplies. The GPS receivers used by commanders and soldiers are called the Commander's Digital Assistant and the Soldier's Digital Assistant, respectively.
  • Target tracking: Various military weapons systems use GPS to track potential ground and air targets before they are flagged as hostile. These weapons systems pass GPS coordinates of targets to precision-guided munitions so they can engage the targets accurately. Military aircraft, particularly those used in air-to-ground roles, use GPS to find targets.
  • Missile and projectile guidance: GPS allows accurate targeting of various military weapons, including ICBMs, cruise missiles and guided missiles.
  • Search and rescue: Downed pilots can be located faster if they carry a GPS receiver.
  • Reconnaissance and map creation: The military uses GPS extensively to aid mapping and reconnaissance.
  • The GPS satellites also carry a set of nuclear detonation detectors consisting of an optical sensor (Y-sensor), an X-ray sensor, a dosimeter, and an electromagnetic pulse (EMP) sensor (W-sensor), which form a major portion of the United States Nuclear Detonation Detection System.

Civilian

  • The ability to determine the receiver's absolute location allows GPS receivers to perform as a surveying tool or as an aid to navigation.
  • The capacity to determine relative movement enables a receiver to calculate local velocity and orientation, useful in vehicles or observations of the Earth.
  • GPS enables researchers to explore the Earth environment including the atmosphere, ionosphere and gravity field.
  • GPS survey equipment has revolutionized tectonics by directly measuring the motion of faults in earthquakes.
  • GPS functionality has now started to move into mobile phones; for example, the Nokia N95 has this feature available.

Electronic Readers

Instead of paper, Sony Corporation has developed an electronic reader that stores data such as text and photos in PDF format. It has a QVGA touch-sensitive screen on which selections can be made with the fingers. No external appliances or devices are required to use the reader.

Finger Prints


A fingerprint is an impression of the friction ridges of all or any part of the finger. Fingerprint identification, referred to as dactyloscopy, is the process of comparing questioned and known friction-ridge impressions from fingers to determine whether the impressions come from the same finger. Because friction-ridge patterns are unique, no two finger or palm prints are ever exactly alike, which is what allows an examiner to determine that two impressions originated from the same finger.

A new method of detecting fingerprints

Since the late nineteenth century, fingerprint identification methods have been used by police agencies around the world to identify both suspected criminals as well as the victims of crime. The basis of the traditional fingerprinting technique is simple. The skin on the palmar surface of the hands and feet forms ridges, so-called papillary ridges, in patterns that are unique to each individual and which do not change over time. Even identical twins do not have identical fingerprints. Fingerprints on surfaces may be described as patent or latent. Patent fingerprints are left when a substance (such as paint, oil or blood) is transferred from the finger to a surface and are easily photographed without further processing. Latent fingerprints, in contrast, occur when the natural secretions of the skin are deposited on a surface through fingertip contact, and are usually not readily visible. The best way to render latent fingerprints visible, so that they can be photographed, is complex and depends, for example, on the type of surface involved. It is generally necessary to use a ‘developer’, usually a powder or chemical reagent, to produce a high degree of visual contrast between the ridge patterns and the surface on which the fingerprint was left.
Developing agents depend on the presence of organic deposits for their effectiveness. However, fingerprints are typically formed by the secretions of the eccrine glands of the fingertips, which principally comprise water and inorganic salts with only a small proportion of organic material such as urea and amino acids, so detecting such fingerprints is far from easy. A further complication is that the organic component of any deposited material is readily destroyed by heat, such as occurs when a gun is fired or a bomb is detonated, when the temperature may reach as high as 500°C. In contrast, the non-volatile, inorganic component of eccrine secretion remains intact even when exposed to temperatures as high as 600°C.
Within the Materials Research Centre, University of Swansea, UK, Professor Neil McMurray and Dr Geraint Williams have developed a technique that enables fingerprints to be visualised on metallic and electrically conductive surfaces without the need to develop the prints first. The technique uses an instrument called a Scanning Kelvin Probe, which measures the voltage, or electrical potential, at pre-set intervals over the surface of an object on which a fingerprint may have been deposited. These measurements can then be mapped to produce an image of the fingerprint. A higher-resolution image can be obtained by increasing the number of points sampled, but at the expense of the time taken for the process. A sampling frequency of 20 points per mm is high enough to visualise a fingerprint in sufficient detail for identification purposes and produces a voltage map in 2-3 hours. So far the technique has been shown to work effectively on a wide range of forensically important metal surfaces including iron, steel and aluminium. While initial experiments were performed on planar, i.e. flat, surfaces, the technique has been further developed to cope with severely non-planar surfaces, such as the warped cylindrical surface of fired cartridge cases. The very latest research from the department has found that physically removing a fingerprint from a metal surface, e.g. by rubbing with a tissue, does not necessarily result in the loss of all fingerprint information. The reason is that the differences in potential on which the visualisation relies are caused by the interaction of inorganic salts in the fingerprint deposit with the metal surface; this interaction begins as soon as the finger touches the metal, forming metal-ion complexes that cannot easily be removed.
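The resolution/time trade-off quoted above is easy to work through. The 20 points/mm figure is from the article; the 15 mm x 15 mm scan window is a hypothetical fingertip-sized area assumed for this estimate:

```python
# Back-of-the-envelope for a Scanning Kelvin Probe fingerprint map.
area_mm = 15          # side of a square scan window, mm (assumed, not from source)
points_per_mm = 20    # sampling frequency quoted for identification-quality maps

points = (area_mm * points_per_mm) ** 2   # total voltage samples in the map
hours = 2.5                               # mid-point of the quoted 2-3 hour scan
rate = points / (hours * 3600)
print(f"{points:,} sample points -> about {rate:.0f} points/s")
```

Doubling the resolution would quadruple the point count, which is why scan time, not probe sensitivity, is the practical limit on image quality.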


Usage of Finger Prints in Locks and other Applications

In the 2000s, electronic fingerprint readers were introduced for security applications such as identifying computer users (log-in authentication). However, early devices were found to be vulnerable to quite simple methods of deception, such as fake fingerprints cast in gel. In 2006, fingerprint sensors gained popularity in the notebook PC market: built-in sensors on laptops from Dell, Toshiba and Sony (VAIO), used in place of passwords, also double as motion detectors for document scrolling, like a scroll wheel.


Iris recognition



Iris recognition is a method of biometric authentication that uses pattern-recognition techniques based on high-resolution images of the irides of an individual's eyes. Iris recognition uses camera technology, with subtle infrared illumination reducing specular reflection from the convex cornea, to create images of the detail-rich, intricate structures of the iris. Converted into digital form, these images provide mathematical representations of the iris that yield unambiguous positive identification of an individual.
Iris recognition is sometimes confused with retinal scanning, a distinct technique that images the pattern of blood vessels at the back of the eye. Although retinal patterns may be altered in cases of diabetes, glaucoma, retinal degenerative disorders or cataracts, the retina typically remains unchanged from birth until death. Due to its unique and unchanging nature, advocates of retinal scanning regard it as one of the most precise and reliable biometrics, with an estimated error rate of only one in a million.
The blood vessels within the retina absorb light more readily than the surrounding tissue and are easily identified with appropriate lighting. A retinal scan is performed by casting an imperceptible beam of low-energy infrared light into a person's eye as they look through the scanner's eyepiece. This beam traces a circular path on the retina. Because retinal blood vessels absorb more of the light than the rest of the eye, the amount of reflection varies along the path. The results of the scan are converted to a digital code and stored in a database.

The idea of retinal identification was first conceived by Dr. Carleton Simon and Dr. Isidore Goldstein and was published in the New York State Journal of Medicine in 1935. In 1976, Robert "Buzz" Hill formed a corporation named EyeDentify, Inc. and made a full-time effort to research and develop such a device. In 1978, the idea of a retinal scanner was patented, followed by a practical working prototype in 1981.

WORKING

An iris-recognition algorithm first has to identify the approximately concentric circular outer boundaries of the iris and the pupil in a photo of an eye. The set of pixels covering only the iris is then transformed into a bit pattern that preserves the information that is essential for a statistically meaningful comparison between two iris images. The mathematical methods used resemble those of modern lossy compression algorithms for photographic images.
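The boundary-finding step can be illustrated with a toy example. The idea behind Daugman's integro-differential operator is to scan candidate radii from an assumed centre and pick the radius where image intensity changes most sharply; here that is reduced to a synthetic 1-D radial intensity profile for readability, with all values invented for illustration.

```python
# Toy sketch of locating the pupil/iris boundary: find the radius at
# which mean intensity along the radial direction jumps most sharply.
# The profile below is synthetic: dark pupil (~20), textured iris
# (~120), bright sclera (~220).

radial_profile = [20, 20, 20, 22, 118, 120, 122, 125, 218, 220]

# Gradient of intensity between consecutive radii.
gradients = [radial_profile[r + 1] - radial_profile[r]
             for r in range(len(radial_profile) - 1)]

# The pupil/iris boundary sits where the radial gradient peaks
# (index 3 here: intensity jumps from 22 to 118).
pupil_radius = max(range(len(gradients)), key=gradients.__getitem__)
```

A real implementation repeats this search over candidate centre points in a 2-D image and finds a second, outer peak for the iris/sclera boundary.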
In the case of Daugman's algorithms, a Gabor wavelet transform is used to extract the spatial frequency range that offers a good signal-to-noise ratio given the focus quality of available cameras. The result is a set of complex numbers that carry local amplitude and phase information for the iris image. In Daugman's algorithms, all amplitude information is discarded, and the resulting 2048 bits that represent an iris consist only of the complex sign bits of the Gabor-domain representation of the image. Discarding the amplitude information ensures that the template remains largely unaffected by changes in illumination and virtually unaffected by iris color, which contributes significantly to the long-term stability of the biometric template. To authenticate via identification (one-to-many template matching) or verification (one-to-one template matching), a template created by imaging the iris is compared to a stored template in a database. If the Hamming distance between the two is below a decision threshold, a positive identification has effectively been made.
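The matching step amounts to a fractional Hamming distance test. The sketch below shortens the codes to 16 bits instead of 2048 for readability, and the threshold value is illustrative rather than Daugman's exact operating point.

```python
# Minimal sketch of iris-code matching: count disagreeing bits,
# normalise by code length, and accept if below a decision threshold.
# Codes and threshold here are illustrative examples.

def hamming_distance(code_a, code_b):
    """Fraction of bit positions in which the two codes disagree."""
    assert len(code_a) == len(code_b)
    differing = sum(a != b for a, b in zip(code_a, code_b))
    return differing / len(code_a)

THRESHOLD = 0.32   # illustrative decision threshold

enrolled = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
probe    = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 0, 1, 0]  # 2 bits differ

distance = hamming_distance(enrolled, probe)   # 2/16 = 0.125
match = distance < THRESHOLD                   # accepted as the same iris
```

Two codes from the same iris differ only in a small fraction of bits, while codes from different irides disagree in roughly half their bits, which is why a single threshold separates the two cases so cleanly.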
A practical problem of iris recognition is that the iris is usually partially covered by eyelids and eyelashes. To reduce the false-reject risk in such cases, additional algorithms are needed to identify the locations of eyelids and eyelashes and exclude the corresponding bits from the comparison operation.
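The exclusion step described above is commonly handled by pairing each iris code with an occlusion mask and comparing only bits valid in both codes. The sketch below uses tiny 8-bit codes and invented mask values purely for illustration.

```python
# Sketch of occlusion handling: each iris code carries a mask marking
# bits that fall under eyelids or eyelashes; only bits valid in BOTH
# masks enter the Hamming-distance comparison. Values are illustrative.

def masked_hamming(code_a, mask_a, code_b, mask_b):
    """Fractional Hamming distance over bits valid in both codes."""
    valid = [ma and mb for ma, mb in zip(mask_a, mask_b)]
    usable = sum(valid)
    if usable == 0:
        return None                      # nothing left to compare
    differing = sum(a != b
                    for a, b, v in zip(code_a, code_b, valid) if v)
    return differing / usable

code_a = [1, 0, 1, 1, 0, 0, 1, 0]
mask_a = [1, 1, 1, 1, 1, 0, 0, 1]        # two bits occluded by an eyelid
code_b = [1, 0, 1, 0, 0, 1, 0, 0]
mask_b = [1, 1, 1, 1, 1, 1, 1, 1]        # fully visible iris

# Only the 6 mutually valid bits are compared; one of them differs.
distance = masked_hamming(code_a, mask_a, code_b, mask_b)  # 1/6
```

Normalising by the number of usable bits, rather than the full code length, keeps distances comparable between heavily and lightly occluded images.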


Advantages

  • It is an internal organ that is well protected against damage and wear by a highly transparent and sensitive membrane, the cornea. This distinguishes it from fingerprints, which can be difficult to recognise after years of certain types of manual labour.
  • The iris is mostly flat and its geometric configuration is only controlled by two complementary muscles (the sphincter pupillae and dilator pupillae), which control the diameter of the pupil. This makes the iris shape far more predictable than, for instance, that of the face.
  • Some argue that a focused digital photograph with an iris diameter of about 200 pixels contains much more long-term stable information than a fingerprint.
  • The iris has a fine texture that – like fingerprints – is determined randomly during embryonic gestation. Even genetically identical individuals have completely independent iris textures, whereas DNA (genetic "fingerprinting") is not unique for the roughly 1.5% of the human population who have a genetically identical monozygotic twin.
  • An iris scan is similar to taking a photograph and can be performed from about 10 cm to a few meters away. There is no need for the person to be identified to touch any equipment that has recently been touched by a stranger, thereby eliminating an objection that has been raised in some cultures against finger-print scanners, where a finger has to touch a surface, or retinal scanning, where the eye can be brought very close to a lens.
  • While there are some medical and surgical procedures that can affect the colour and overall shape of the iris, the fine texture remains remarkably stable over many decades. Some iris identifications have succeeded over a period of about 30 years.
However, iris recognition has its limitations too.

Limitations


  • Iris recognition is very difficult to perform at a distance, or if the person to be identified is not cooperating by holding the head still and looking into the camera.
  • As with other photographic biometric technologies, iris recognition is susceptible to poor image quality.
