Tuesday, August 19, 2014

Bermuda Triangle Mystery



The U.S. Board on Geographic Names does not recognize the Bermuda Triangle as an official name and does not maintain an official file on the area.

The "Bermuda or Devil's Triangle" is an imaginary area located off the southeastern Atlantic coast of the United States, which is noted for a high incidence of unexplained losses of ships, small boats, and aircraft. The apexes of the triangle are generally accepted to be Bermuda, Miami, Fla., and San Juan, Puerto Rico.

In the past, extensive but futile Coast Guard searches prompted by search and rescue cases, such as the disappearance of an entire squadron of TBM Avengers shortly after takeoff from Fort Lauderdale, Fla., or the traceless sinking of USS Cyclops and Marine Sulphur Queen, have lent credence to the popular belief in the mystery and the supernatural qualities of the "Bermuda Triangle."

Countless theories attempting to explain the many disappearances have been offered throughout the history of the area. The most practical seem to be environmental ones and those citing human error. The majority of disappearances can be attributed to the area's unique environmental features. First, the "Devil's Triangle" is one of the two places on earth where a magnetic compass points toward true north rather than magnetic north, as it does elsewhere. The difference between the two directions is known as compass variation, and the amount of variation changes by as much as 20 degrees as one circumnavigates the earth. If this compass variation or error is not compensated for, a navigator can find himself far off course and in deep trouble.
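As a rough illustration of why uncorrected variation matters, the sketch below converts a magnetic heading to a true heading and estimates the lateral offset that builds up over a passage. The headings, variation, and distance are hypothetical values chosen only for the example.

```python
import math

def true_heading(magnetic_heading_deg, variation_deg_east):
    """True heading = magnetic heading + easterly variation (westerly is negative)."""
    return (magnetic_heading_deg + variation_deg_east) % 360

def cross_track_error_nm(distance_nm, heading_error_deg):
    """Approximate lateral offset after running distance_nm with a constant heading error."""
    return distance_nm * math.sin(math.radians(heading_error_deg))

# Hypothetical example: a navigator steers 090 degrees magnetic and ignores
# a 10-degree westerly variation over a 100-nautical-mile leg.
steered = 90.0
variation = -10.0                             # westerly variation, in degrees
intended = true_heading(steered, 0)           # the course the navigator thinks he is making good
actual = true_heading(steered, variation)     # the true course actually made good
error = intended - actual

print(f"True course made good: {actual:.0f} deg")
print(f"Offset after 100 nm: {cross_track_error_nm(100, error):.1f} nm")
```

With these made-up numbers the boat ends up roughly 17 nautical miles off the intended track, which is more than enough to miss a small island or run onto a reef.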

An area called the "Devil's Sea" by Japanese and Filipino seamen, located off the east coast of Japan, also exhibits the same magnetic characteristics. It is also known for its mysterious disappearances.

Another environmental factor is the character of the Gulf Stream. It is extremely swift and turbulent and can quickly erase any evidence of a disaster. The unpredictable Caribbean-Atlantic weather pattern also plays its role. Sudden local thunderstorms and waterspouts often spell disaster for pilots and mariners.

Finally, the topography of the ocean floor varies from extensive shoals around the islands to some of the deepest marine trenches in the world. With the interaction of the strong currents over the many reefs, the topography is in a state of constant flux, and new navigational hazards can develop swiftly.

Not to be underestimated is the human error factor. A large number of pleasure boats travel the waters between Florida's Gold Coast and the Bahamas. All too often, crossings are attempted with too small a boat, insufficient knowledge of the area's hazards, and a lack of good seamanship.


The Coast Guard and most other official sources are not impressed with supernatural explanations of disasters at sea. It has been their experience that the combined forces of nature and the unpredictability of mankind outdo even the most far-fetched science fiction many times each year.

Brain–computer interface

Brain-computer interface (BCI) is a collaboration between a brain and a device that enables signals from the brain to direct some external activity, such as control of a cursor or a prosthetic limb. The interface enables a direct communications pathway between the brain and the object to be controlled. In the case of cursor control, for example, the signal is transmitted directly from the brain to the mechanism directing the cursor, rather than taking the normal route through the body's neuromuscular system from the brain to the finger on a mouse.

By reading signals from an array of neurons and using computer chips and programs to translate the signals into action, BCI can enable a person suffering from paralysis to write a book or control a motorized wheelchair or prosthetic limb through thought alone. Current brain-interface devices require deliberate conscious thought; some future applications, such as prosthetic control, are likely to work effortlessly. One of the biggest challenges in developing BCI technology has been the development of electrode devices and/or surgical methods that are minimally invasive. In the traditional BCI model, the brain accepts an implanted mechanical device and controls the device as a natural part of its representation of the body. Much current research is focused on the potential of non-invasive BCI.
At the European Research and Innovation Exhibition in Paris in June 2006, American scientist Peter Brunner composed a message simply by concentrating on a display. Brunner wore a close-fitting (but completely external) cap fitted with a number of electrodes. Electroencephalographic (EEG) activity from Brunner's brain was picked up by the cap's electrodes and the information used, along with software, to identify specific letters or characters for the message.
The BCI Brunner demonstrated is based on a method called the Wadsworth system. Like other EEG-based BCI technologies, the Wadsworth system uses adaptive algorithms and pattern-matching techniques to facilitate communication. Both user and software are expected to adapt and learn, making the process more efficient with practice.
During the presentation, a message was displayed from an American neurobiologist who uses the system to continue working, despite suffering from amyotrophic lateral sclerosis (Lou Gehrig's disease). Although the scientist can no longer move even his eyes, he was able to send the following e-mail message: "I am a neuroscientist wHo (sic) couldn't work without BCI. I am writing this with my EEG courtesy of the Wadsworth Center Brain-Computer Interface Research Program."
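The general idea behind EEG-based letter selection can be sketched as a toy pattern-matching exercise: average the EEG epochs recorded while each candidate letter flashes, and pick the letter whose averaged response most resembles the user's known "attended" response template. Everything below (the template shape, the simulated epochs, the scoring) is made up purely for illustration and is not the Wadsworth algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(0)

n_samples = 64  # samples per EEG epoch (hypothetical)
template = np.sin(np.linspace(0, np.pi, n_samples))  # stand-in "attended" response shape

def simulate_epochs(letter_is_target, n_epochs=15):
    """Simulate noisy EEG epochs; the target letter carries the template, others only noise."""
    signal = template if letter_is_target else np.zeros(n_samples)
    return signal + rng.normal(0.0, 1.0, size=(n_epochs, n_samples))

def score_letter(epochs):
    """Average the epochs and correlate the average with the response template."""
    avg = epochs.mean(axis=0)
    return float(np.corrcoef(avg, template)[0, 1])

letters = "ABCD"
target = "C"
scores = {ch: score_letter(simulate_epochs(ch == target)) for ch in letters}
chosen = max(scores, key=scores.get)
print(scores, "-> selected:", chosen)
```

Averaging many epochs is what makes the weak, noisy signal usable, which is also why both the user and the software improve with practice.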
DARPA, the independent research branch of the U.S. Department of Defense that helped fund the Internet, is among the organizations funding research into BCI.

Wednesday, March 21, 2012

College Attendance Management System


Overview

College Attendance Management System (CAMS) is a biometric- and RFID-based comprehensive attendance management system for colleges. CAMS provides a robust, secure, and automatic attendance management system for both students and staff, and has an inbuilt facility for sending automatic SMS and email alerts to the guardians of students.

The Need

Compliance – All colleges in the U.K. offering courses to non-EU nationals are required to maintain records of students' attendance, in compliance with Home Office regulations.
More efficient student attendance – CAMS automates student and staff attendance, reducing irregularities in the attendance process that arise from human error.
Saving time – Important administrative and educational resources can be freed up by using CAMS.
Environmentally friendly – Reduces paper and other resource requirements. CAMS, with its extremely small carbon footprint, is a greener option.
Better guardian information system – Guardians are better informed about their wards' whereabouts through the automatic SMS, bulk SMS, and email facilities in CAMS.
Improved guardian–college relationship – CAMS brings the two principal stakeholders in a student's education closer, considerably improving the guardian–college rapport and leading to an atmosphere of comfort and trust.


CAMS is built on cutting-edge technology and is designed to help colleges and guardians deal with problems of truancy and absenteeism.

CAMS is built on a robust client-server architecture and supports multiple simultaneous clients, enabling admin staff to perform their functions with ease.

The system consists of the following technology elements as shown in the figure.
Biometric Attendance Devices
Biometric Mobile Devices
Radio Frequency ID (RFID) Tags
CAMS Local Server (CLS)
CAMS Online Server (COS)
CAMS Web Interface
CAMS Software
Bulk SMS facility
Automatic Email Alerts

In a typical CAMS setup, a biometric device is installed outside each classroom. These devices are connected to the CAMS local server housed in the college.

Each day, students register their attendance with the biometric device. The CAMS local server downloads the attendance data in real time, and the collected data is then securely synchronized with the CAMS Online Server (COS). COS processes the attendance and, if the facility is activated, sends a preset SMS message to the guardians of each absentee student via the CAMS SMS gateway server. Once the attendance is processed, all attendance-related reports are available on COS for printing and download.
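The end-of-day absentee processing described above might look roughly like the sketch below. The record format, the send_sms function, and the message wording are hypothetical placeholders standing in for the CAMS database and SMS gateway, not the actual CAMS implementation.

```python
from datetime import date

# Hypothetical attendance records synced from the local server to COS.
# Each record is (student_id, date_of_scan) captured by a biometric device.
todays_scans = {("S1001", date.today()), ("S1003", date.today())}

# Hypothetical student register with guardian contact numbers.
students = {
    "S1001": {"name": "A. Student", "guardian_mobile": "+440000000001"},
    "S1002": {"name": "B. Student", "guardian_mobile": "+440000000002"},
    "S1003": {"name": "C. Student", "guardian_mobile": "+440000000003"},
}

def send_sms(number, text):
    """Placeholder for the CAMS SMS gateway call."""
    print(f"SMS to {number}: {text}")

def process_absentees(scans, register, when):
    """Mark anyone without a scan for the day as absent and alert the guardian."""
    present = {student_id for student_id, scan_date in scans if scan_date == when}
    for student_id, info in register.items():
        if student_id not in present:
            send_sms(
                info["guardian_mobile"],
                f"{info['name']} ({student_id}) was marked absent on {when}.",
            )

process_absentees(todays_scans, students, date.today())
```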

The reports are available in PDF, CSV, HTML, Word, Excel, RTF, text, and XML formats. Authorized users can log in to CAMS online via the web to retrieve records and reports as required.

Student Registration

Every student has to be registered and activated with CAMS. While registering a student, the following information is required:

Personal Details
Course Details
Course Duration
Scanned Visa Copy
College Details
Biometric Data

Registration can be done via the web interface provided to the college along with the biometric fingerprint reader. Once a student is activated with the college, all devices in the college will accept the student's biometric identification, and attendance will be marked against the course he or she is taking. Once a student has been activated on CAMS, the college will be sent a student ID smart card via the CAMS card printing service. The student ID card features the following:

Student ID number
College Name and Logo
Student Personal Details
Student Photograph
Course
Validity period

The design and information on student ID cards can be set by the college, based on the templates provided by CAMS to the college administration.

Email and SMS

The system also allows the College authorities to send SMS alerts to guardians regarding special events and emergencies.

Automatic emails can also be enabled so that attendance reports are emailed to the relevant person on a set cycle.

In-built SMS and email templates are provided to the college. Custom templates can also be created within CAMS according to the college's needs.

Staff Attendance

CAMS also has fully integrated Time Manager Software which provides various employee attendance and HR related functionalities e.g. Shift Scheduling, Shift Rota, Attendance Summary, Leave Management etc. CAMS Time Manager Module can also be integrated with leading payroll processing software.

Secure Login

CAMS treats data security as being of paramount importance. A secure, state-of-the-art biometric login system, 'VAJRA', is used to log in to the CAMS system.

Users are required to authenticate themselves using biometric fingerprint data before they are granted access to CAMS.

Once logged in, the system maintains an automatic log of all activities carried out by the user within the system.

Features
Accurate Student Attendance
Automatic Attendance Collection
Daily Absentee Report
Automatic SMS alert to guardian of absentee student
Daily attendance Register
Monthly attendance Register
Yearly attendance Report
Attendance Summary Report
Bulk SMS facility for special events and announcements
RFID option
Mobile attendance data collection and reporting
Robust employee attendance system

Benefits


Ensures better compliance with Home Office regulations for attendance monitoring of foreign students.

Better student attendance management

Less administrative work

Better accounting of students' whereabouts during college hours

More parental involvement in ascertaining student presence in College

Improves student attendance ratio

Better College staff attendance management

Data Protection Safeguards

The user's fingerprints are never stored on the device or in the software; only uniquely identifiable digital data called minutiae are extracted, encrypted, and securely stored. Hence, no personally identifiable data is stored on the system.

Encrypted data is stored using FIPS 250 compliant encryption.

High-quality, US FBI-approved sensors are used to capture data; these sensors are capable of hardware encryption at source.

All data is stored in a 128-bit encrypted data packet within the system.

The application performs a secure, encrypted verification match; this ensures that digital biometric data is never exposed to any external environment.
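A minimal sketch of the store-encrypted, match-in-memory idea is shown below. It is purely illustrative: it uses Python's cryptography Fernet recipe rather than the FIPS-validated, hardware-at-source encryption CAMS describes, and the minutiae "template" and the matcher are stand-ins for a real fingerprint algorithm.

```python
from cryptography.fernet import Fernet

# Key management is hand-waved here; a real system would use an HSM or a
# protected key store rather than a key generated inline.
key = Fernet.generate_key()
vault = Fernet(key)

def enroll(minutiae_template: bytes) -> bytes:
    """Store only the encrypted minutiae template, never a raw fingerprint image."""
    return vault.encrypt(minutiae_template)

def verify(stored_token: bytes, live_template: bytes) -> bool:
    """Decrypt the stored template only in memory and compare it with the live capture."""
    enrolled = vault.decrypt(stored_token)
    # Stand-in matcher: real fingerprint matching is a fuzzy comparison of
    # minutiae points, not an exact byte comparison.
    return enrolled == live_template

token = enroll(b"example-minutiae-template")
print(verify(token, b"example-minutiae-template"))  # True
print(verify(token, b"different-template"))         # False
```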

Car stereo Mp3 Player with 2GB Inbuilt Memory




Incredible Car MP3 Music Player FM Transmitter ISO 9001 CERTIFIED

Just arrived: limited quantities of the Car MP4/WMA Wireless FM Transmitter. This model works like any normal FM modulator but offers a choice of 206 FM channels. It is compatible with USB sticks, MP3 players, and SD cards. The adjustable neck allows you to set the LCD display to the best viewing angle. Just insert a USB flash drive (or a USB card reader with a flash card) containing MP3 files into the transmitter, or connect your MP3 player, iPod, CD, DVD, computer, or other audio device via the audio cable, then tune your radio to the chosen frequency and press play to enjoy music through the car stereo.

Liven up your choice of audio in the car: get all that downloaded music off your hard drive and pump it through your car stereo instead!

Manufacturer Specification
FM Frequency Range: 87.5MHz~108MHz
USB: USB 2.0
Power Source: Directly from car cigarette lighter
Built-in FM transmitter with remote control
Reads MP3s from most USB flash MP3 / MP4 Players
2.5mm-to-3.5mm line-in cord to connect the FM transmitter to a 3.5mm jack audio source

Features:
Car cigar socket power supply
24 Volt Powered Wireless FM Transmission
206 FM Channels
USB Port
SD/MMC Card Reader
Adjustable Angle Control
Compatible with CD/DVD/Flash devices/SD Card
Memory card – 3,000 songs of your choice
Volume control
LCD – shows the volume and track number

Accessories:
Remote Control
2.5mm to 3.5mm audio cable or USB cable (depends on model)
Car USB-MP3 Transmitter

Performance:
For Flash Memory/Memory stick (USB Only)
1. Plug the MP3 Player FM Modulator into the car cigarette lighter socket.
2. Plug your flash memory into the FM Modulator's USB port.
3. Or, using the audio cable, plug your MP3, CD, or DVD player or other audio device into the FM Modulator's stereo jack.
4. Select an FM Modulator channel that no local radio station is broadcasting on.
5. Tune your car stereo to the same FM frequency as the FM Modulator.
6. Press the Play button on the FM Modulator.
7. The music/audio is transmitted to the vehicle's stereo system.

NEW MODEL
Just arrived, a new model with the following features:

* 2 GB built-in memory – up to 500 songs
* Expandable up to 4 GB with an SD memory card
* Can also be used as flash storage for files

Tuesday, November 9, 2010

Lazy males made to work


The best-known insect societies are those of ants, bees, and wasps, all of which belong to the order Hymenoptera. Individuals in these species organize themselves into colonies consisting of tens to millions of individuals. Each colony is headed by one or a small number of fertile queens, while the rest of the individuals serve as sterile or nearly sterile workers. The spectacular ecological success of the social insects, their caste differentiation, division of labour and highly developed communication systems are well known. A less studied but equally intriguing aspect of these hymenopteran societies is that they are feminine monarchies – there are queens but no kings, and all workers are females. Males do little more than transfer their sperm to virgin queens, while all the work involved in nest building, brood care, and finding and processing food is done by the females.

Why don’t males work, at least during the period that they stay on the nests of their birth? Using the Indian primitively eusocial wasp Ropalidia marginata and the important task of feeding larvae as an example of work, we have recently made a novel attempt to understand the secret behind the well-known laziness of the males. We considered three hypotheses:
males are incapable of feeding larvae,
males never get access to enough food to satisfy themselves and have something left over to offer to the larvae (males do not forage on their own and depend on the females for access to food), and
females are so much more efficient at feeding larvae that they leave no opportunities for the relatively inefficient males to do so.

To test these hypotheses, my graduate student Ms. Ruchira Sen offered experimental colonies excess food. This resulted in a marginal amount of feeding of the larvae by males, thus disproving the hypothesis that males are incapable of feeding larvae. Then she removed all the females from some colonies and left the males alone with hungry larvae. This experiment was a non-starter because males cannot forage and find food in the absence of females. Ruchira overcame this problem by mastering the art of tenderly and patiently hand-feeding the males, and she gave them more food than they could themselves consume so that they might feed larvae if they could. Her efforts were rewarded when males under these conditions fed larvae at rates nearly comparable to those of the females. Thus males can feed larvae and will do so if given the opportunity. It therefore appears that males do not feed larvae under natural circumstances because they do not have access to enough food and/or because females leave them few opportunities to do so. There are several lines of evidence to suggest that the males were not merely dumping unwanted food but were actively seeking out the most appropriate larvae and feeding them "deliberately". But it must be emphasized that, from the point of view of the larvae, males were quite inefficient compared to the females. Apart from the fact that males fed only the oldest larvae and ignored all the young larvae, it turned out that many of the larvae under all-male care died.

In addition to their obvious interest, these studies open up a major evolutionary puzzle: why has natural selection not made the males more efficient and made feeding larvae by males a routine matter? Answering one question raises at least one more – and that’s how it should be.

Hard rocks can have long memories


One of the best ways to understand the geological history of our 4,500-million-year-old planet is to study rocks formed under a wide variety of geological conditions. Geologists, equipped with their vast experience and advanced analytical instruments, can identify and interrogate those rocks that best preserve evidence of past geological events. One such instrument is the sensitive high resolution ion microprobe (SHRIMP), a large specialized mass spectrometer that measures the ages of rocks, their precursors and major thermal events by firing a 10,000 volt ion beam at crystals as small as 0.05 mm in diameter and measuring the isotopic abundances of the lead, uranium and thorium that are released.

The reconstruction of the continents that existed in the past is an important part of understanding the dynamic evolution of earth. The ancient supercontinent of Gondwana once consisted of what are now the smaller continents of South America, Africa, Madagascar, southern India, Sri Lanka, Antarctica and Australia. Determining the timing of the geological events involving rock formation and modification (deformation, metamorphism etc.) in these continental fragments is vital in piecing together the evolution of the earth's crust during any period of geological time. Most rocks 'forget' their history if exposed to extreme geological conditions, but there are some rare cases where particular rocks derived from the earth's lower crust have preserved, in their distinctive mineralogy, convincing evidence of the very high temperatures that can be present at depth.
The rocks of the central Highland Complex in Sri Lanka, and some parts of Antarctica and southern India, have been subjected to some of the highest peak temperatures of crustal metamorphism known, over 1100°C. At such temperatures most rocks would turn into molten magma, but in the November issue of Geology, Sajeev and others report rocks from near Kandy (Sri Lanka) that not only survived the high temperatures, but contain crystals of zircon in which a uranium-lead isotopic record of their provenance and thermal history has survived. Such survival is contrary to all predictions from experimental studies of the rate at which lead should be lost from zircon by thermal diffusion.

From a study of the metamorphic minerals and thermodynamic modelling, and SHRIMP uranium-lead isotopic analyses of zircon and monazite (cerium phosphate), the authors have shown that the rocks near Kandy were originally sediments derived from sources ranging in age from 2500 to 830 million years. The sediments were heated to over 1100°C at a depth of about 25 km about 570 million years ago, and then rapidly lifted towards the surface, while still hot, about 550 million years ago. These Sri Lankan rocks were probably trapped and buried in the violent collision between the two halves of the Gondwana supercontinent about 600 million years ago, superheated by basalt magmas rising from the earth's interior, then forced to the near surface again as the tectonic pressures relaxed. The preservation of the isotopic record of these events is remarkable, and still remains to be fully explained.

Wednesday, October 6, 2010

The 15 Most Promising Inventions of 2010

15. nPower Personal Energy Generator

14. Flying Car: Terrafugia

13. Sony 3D-360 Hologram

12. Xeros Waterless Washing Machine

11. Recompute: The Cardboard Computer

10. Powermat Wireless Battery Charger

09. Samsung Water-Powered Battery

08. 2010 Brabus Mercedes-Benz Viano Lounge

07. V12 Dual-Touchscreen Notebook

06. MyKey by Ford

05. Tri-Specs

04. Google Wave

03. The KS810 Keyboard Scan

02. Apple Tablet

01. Software that Captures Sports Games Robotically

Tuesday, September 7, 2010

Human Teleportation-2




Ever since the wheel was invented more than 5,000 years ago, people have been inventing new ways to travel faster from one point to another. The chariot, bicycle, automobile, airplane and rocket have all been invented to decrease the amount of time we spend getting to our desired destinations. Yet each of these forms of transportation shares the same flaw: they require us to cross a physical distance, which can take anywhere from minutes to many hours depending on the starting and ending points.

But what if there were a way to get you from your home to the supermarket without having to use your car, or from your backyard to the International Space Station without having to board a spacecraft? There are scientists working right now on such a method of travel, combining properties of telecommunications and transportation to achieve a system called teleportation. In this article, you will learn about experiments that have actually achieved teleportation with photons, and how we might be able to use teleportation to travel anywhere, at any time.

Teleportation involves dematerializing an object at one point, and sending the details of that object's precise atomic configuration to another location, where it will be reconstructed. What this means is that time and space could be eliminated from travel -- we could be transported to any location instantly, without actually crossing a physical distance.

Many of us were introduced to the idea of teleportation, and other futuristic technologies, by the short-lived Star Trek television series (1966-69) based on tales written by Gene Roddenberry. Viewers watched in amazement as Captain Kirk, Spock, Dr. McCoy and others beamed down to the planets they encountered on their journeys through the universe.

In 1993, the idea of teleportation moved out of the realm of science fiction and into the world of theoretical possibility. It was then that physicist Charles Bennett and a team of researchers at IBM confirmed that quantum teleportation was possible, but only if the original object being teleported was destroyed. This revelation, first announced by Bennett at an annual meeting of the American Physical Society in March 1993, was followed by a report on his findings in the March 29, 1993 issue of Physical Review Letters. Since that time, experiments using photons have proven that quantum teleportation is in fact possible.

Human Teleportation-1




We are years away from the development of a teleportation machine like the transporter room on Star Trek's Enterprise spaceship. The laws of physics may even make it impossible to create a transporter that enables a person to be sent instantaneously to another location, which would require travel at the speed of light.

For a person to be transported, a machine would have to be built that can pinpoint and analyze all of the 10^28 atoms that make up the human body. That's more than a trillion trillion atoms. This machine would then have to send this information to another location, where the person's body would be reconstructed with exact precision. Molecules couldn't be even a millimeter out of place, lest the person arrive with some severe neurological or physiological defect.


In the Star Trek episodes, and the spin-off series that followed it, teleportation was performed by a machine called a transporter. This was basically a platform that the characters stood on, while Scotty adjusted switches on the transporter room control boards. The transporter machine then locked onto each atom of each person on the platform, and used a transporter carrier wave to transmit those molecules to wherever the crew wanted to go. Viewers watching at home witnessed Captain Kirk and his crew dissolving into a shiny glitter before disappearing, rematerializing instantly on some distant planet.

If such a machine were possible, it's unlikely that the person being transported would actually be "transported." It would work more like a fax machine -- a duplicate of the person would be made at the receiving end, but with much greater precision than a fax machine. But what would happen to the original? One theory suggests that teleportation would combine genetic cloning with digitization.

In this biodigital cloning, tele-travelers would have to die, in a sense. Their original mind and body would no longer exist. Instead, their atomic structure would be recreated in another location, and digitization would recreate the travelers' memories, emotions, hopes and dreams. So the travelers would still exist, but they would do so in a new body, of the same atomic structure as the original body, programmed with the same information.

But as with all technologies, scientists are sure to continue improving upon the ideas of teleportation, to the point that we may one day be able to avoid such harsh methods. One day, one of your descendants could finish up a work day at a space office above some faraway planet in a galaxy many light-years from Earth, tell his or her wristwatch that it's time to beam home for dinner on planet X below, and sit down at the dinner table as soon as the words leave his or her mouth.

Thursday, August 26, 2010

Thermal Design Power (TDP)




To understand power management, it's important to fully appreciate the ways designers deal with average and peak power. Most of this article will focus on how average power can be reduced, but there are also some interesting power management techniques to handle the case of peak power consumption. TDP is a measure of how much power needs to be dissipated by the cooling solution when the CPU is running the maximum software workload that would be expected in normal operating conditions. (With specialized test code, a CPU could generate even more heat.)
More specifically, the CPU manufacturers calculate TDP as the amount of heat that needs to be transferred from the processor die in order to keep the transistor junction temperature (Tj) below the maximum for which the device is guaranteed to operate (Tj is usually 100 degrees centigrade or lower, but note that things are actually much more complicated. Some vendors will often specify a die "case" temperature as low as 70 degrees centigrade in order to get high clock rates. That's why some desktop heat sinks are enormous.)

How that heat gets removed is part of the system thermal design and can be accomplished by heat sinks, fans, and air vents. In a mobile device, a large portion of the heat is conductively transferred through the system chassis—and then onto your lap, highlighting one of the limitations of using a CPU that has a high TDP value. Many laptops use CPUs with a TDP of 30 or more watts. These are easily identified by the fans in the case and the short amount of time you'd actually want the machine on your lap. Note that multicore CPUs make this problem even worse, since the TDP and cooling solutions are based on all cores running simultaneously.

CPU's Protect Themselves from Killer Heat

Before the CPU die can exceed the maximum junction temperature, on-chip thermal sensors signal special circuitry to lower the temperature. Over the years, CPUs have incorporated several mechanisms for measuring and controlling temperature. An on-chip thermal diode allows an external analog-to-digital converter to monitor temperature. Basically, the diode current changes as the chip heats up, allowing the system microcontroller to measure the voltage difference and take action to lower the temperature.
The system vendors program the microcontroller with temperature control algorithms to speed up fans, throttle the CPU, etc. In some designs, the CPU will run its own BIOS code to control temperature. However, CPU designers were worried about chip damage if the external microcontroller were to fail. Also, some of the thermal spikes happen so rapidly that it was possible to exceed maximum die temperature before the system could respond. Additional on-chip temperature sensors have been added, directly controlling digital logic that automatically reduces CPU performance and temperature. If for some reason the CPU temperature keeps rising, eventually it reaches a critical condition, and hardware signals the power supply to shut down completely.
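For a rough sense of how a thermal diode reading turns into a temperature, the sketch below uses the textbook ideal-diode relation delta_Vbe = η·(kT/q)·ln(N), where delta_Vbe is the base-emitter voltage difference measured at two bias currents in ratio N and η is a non-ideality factor. Real temperature-sensor ICs calibrate η and add filtering; the numbers here are illustrative only.

```python
import math

K_BOLTZMANN = 1.380649e-23    # J/K
Q_ELECTRON = 1.602176634e-19  # C

def junction_temp_kelvin(delta_vbe_volts, current_ratio=10.0, ideality=1.008):
    """Invert delta_Vbe = ideality * (k*T/q) * ln(N) to recover the junction temperature."""
    return delta_vbe_volts * Q_ELECTRON / (ideality * K_BOLTZMANN * math.log(current_ratio))

# Illustrative reading: ~60 mV measured with a 10:1 current ratio corresponds to roughly 300 K (27 C).
t_kelvin = junction_temp_kelvin(0.0602)
print(f"{t_kelvin:.1f} K = {t_kelvin - 273.15:.1f} C")
```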

Sometimes you'll see references to Thermal Monitor 1 (TM1) and Thermal Monitor 2 (TM2). These are mechanisms used by the CPU to quickly reduce performance and get an accompanying drop in power consumption. TM1 is an older technology and simply inserts idle cycles to effectively halve the pipeline frequency, even though the clock signal continues to run at the same frequency. This is a dramatic drop in performance for a linear drop in power consumption.

TM2 uses dynamic voltage scaling (DVS) techniques to reduce the clock frequency and then signals the external voltage regulator to shift to a lower voltage. The power supply voltage won't drop instantaneously because of capacitance. However, the voltage reduction has the biggest impact on temperature, since power varies with the square of the voltage. We'll talk more about dynamic voltage scaling, since it is a key power management technique that helps reduce average power consumption. There are differences in the algorithms used by the various CPU vendors for how they throttle clock rate and voltage to keep the die below maximum temperature.
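The quadratic dependence on voltage is why TM2 is so much more effective than TM1. The toy calculation below compares dynamic power P ≈ C·V²·f for a baseline operating point, TM1-style throttling (halved effective frequency at the same voltage), and TM2-style scaling (lower frequency and lower voltage). The capacitance, voltages, and frequencies are made-up illustrative values, not any vendor's actual operating points.

```python
def dynamic_power_watts(c_farads, v_volts, f_hertz):
    """Classic switching-power approximation: P = C * V^2 * f."""
    return c_farads * v_volts ** 2 * f_hertz

C = 1.0e-9  # effective switched capacitance (illustrative)
baseline = dynamic_power_watts(C, 1.2, 2.0e9)  # 1.2 V at 2.0 GHz
tm1_like = dynamic_power_watts(C, 1.2, 1.0e9)  # idle cycles: half the effective frequency, same voltage
tm2_like = dynamic_power_watts(C, 0.9, 1.0e9)  # lower frequency AND lower voltage

for label, p in [("baseline", baseline), ("TM1-style", tm1_like), ("TM2-style", tm2_like)]:
    print(f"{label:10s}: {p:.2f} W ({p / baseline:.0%} of baseline)")
```

With these numbers, halving the frequency alone cuts power to about half, while also dropping the voltage from 1.2 V to 0.9 V cuts it to under a third of the baseline.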

Monday, July 19, 2010

System Cloning Overview



Windows XP Embedded includes the System Cloning Tool component. The system cloning process is used during manufacture to ensure that each device has a run-time image containing a unique computer security ID (SID) and computer name.

If each device undergoes the stand-alone First Boot Agent (FBA) process separately, cloning is not required. However, the stand-alone FBA process is time-consuming and therefore impractical in a typical production environment.

If you simply copied the same post-FBA image to every device, every device would share the same computer SID. This presents a problem because every computer running Windows XP is required to have a unique computer SID. The solution is to include the System Cloning Tool component in your run-time image.

The cloning process consists of the following two phases:

Reseal phase
The reseal phase occurs on the device, which is called the master because the image created on it will be the cloned image. Typically, the reseal phase occurs just before the reboot that precedes the cloning phase; however, additional operations can occur between the reseal phase and the device reboot. After the reseal phase has completed, you must immediately shut off the device before the subsequent reboot would typically occur. At this time, the on-disk image is ready for cloning. For more information, see Reseal Phase.

Cloning phase
The cloning phase automatically begins the first time the image boots after the reseal phase, unless you set the extended property cmiResealPhase to 0 in Target Designer. Typically, this occurs after the on-disk image from the master has been copied to another device, or the clone. The clone device picks up where the master device has left off after the reseal phase. During the cloning phase, the computer SID from the master device is replaced with a unique computer SID everywhere the SID appears. This makes each clone unique where it is required but identical to the master everywhere else. The following illustration shows an overview of the cloning process.



During the cloning phase, you see a message in the Windows XP boot monitor stating that Windows is starting. This message notifies you that the cloning process is working. The amount of time spent in this phase depends on the size of the image and whether it is a FAT or NTFS file system. An image on an NTFS file system partition will take longer to clone because the NTFS file system uses SIDs to control access to each file system object using access control lists (ACLs).
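The reason NTFS cloning takes longer can be pictured with a small conceptual sketch: every file system object carries an ACL whose entries reference the computer SID, so the cloning phase has to visit each object and rewrite the master SID with the newly generated one. The sketch below is purely illustrative Python over an in-memory mock of a file tree; it is not the System Cloning Tool or the Windows security API.

```python
import uuid

MASTER_SID = "S-1-5-21-1111111111-2222222222-3333333333"

# Mock of an NTFS volume: every object carries an ACL that references the computer SID.
mock_volume = {
    r"C:\Windows\explorer.exe": {"acl": [MASTER_SID + "-500", MASTER_SID + "-513"]},
    r"C:\Data\report.txt":      {"acl": [MASTER_SID + "-1001"]},
}

def generate_unique_sid():
    """Stand-in for real SID generation; it just needs to be unique per clone."""
    n = uuid.uuid4().int
    parts = [(n >> shift) & 0xFFFFFFFF for shift in (0, 32, 64)]
    return "S-1-5-21-" + "-".join(str(p) for p in parts)

def clone_volume(volume, old_sid):
    new_sid = generate_unique_sid()
    for path, obj in volume.items():
        # This per-object pass is what makes NTFS cloning slower than FAT,
        # which has no ACLs to rewrite.
        obj["acl"] = [entry.replace(old_sid, new_sid) for entry in obj["acl"]]
    return new_sid

new_sid = clone_volume(mock_volume, MASTER_SID)
print("New computer SID:", new_sid)
print(mock_volume[r"C:\Data\report.txt"]["acl"])
```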
