The "chipping" of two workers with RFIDs -- radio-frequency identification tags as long as two grains of rice, as thick as a toothpick -- was merely a way of restricting access to vaults that held sensitive data and images for police departments, a layer of security beyond key cards and clearance codes, the company said.
"To protect high-end secure data, you use more sophisticated techniques," Sean Darks, chief executive of the Cincinnati-based company, said. He compared chip implants to retina scans or fingerprinting. "There's a reader outside the door; you walk up to the reader, put your arm under it, and it opens the door."
Innocuous? Maybe.
But the news that Americans had, for the first time, been injected with electronic identifiers to perform their jobs fired up a debate over the proliferation of ever-more-precise tracking technologies and their ability to erode privacy in the digital age.
To some, the microchip was a wondrous invention -- a high-tech helper that could increase security at nuclear plants and military bases, help authorities identify wandering Alzheimer's patients, allow consumers to buy their groceries, literally, with the wave of a chipped hand.
To others, the notion of tagging people was Orwellian, a departure from centuries of history and tradition in which people had the right to go and do as they pleased without being tracked, unless they were harming someone else.
Chipping, these critics said, might start with Alzheimer's patients or Army Rangers, but would eventually be suggested for convicts, then parolees, then sex offenders, then illegal aliens -- until one day, a majority of Americans, falling into one category or another, would find themselves electronically tagged.