Future E-Toll Style Technologies Challenge Economy and Ethics
South Africa is arguably among the fastest nations in the developing world to adopt new or emerging digital technologies.
The nation’s rapid acceptance of digital technologies speaks to its desire to develop quickly. Its early adoption of new tools has helped it leapfrog other developing nations technologically.
However, this early acceptance also potentially raises as many problems as it solves, on the ethical, economic, business and mental health fronts.
The growth of South Africa’s mobile phone market illustrates its hunger for new technologies.
Across the African continent as a whole, more people now have access to mobile phones than have access to fresh water. There are more than 650 million mobile phone subscriptions in Africa. In South Africa, a 2011 Nielsen study showed that more people use mobile phones than use computers, TVs, radios or landline telephones.
Meanwhile, South Africans have, by and large, accepted with little fuss the introduction of chip-enhanced ID cards. These are of the type accepted in parts of Europe but rejected in the UK because of privacy and security concerns.
However, while bringing benefits, this approach may also see South Africa too quickly adopt such things as under-the-skin payment devices – like those foreseen in a recent international study by Visa.
The recent introduction of e-toll machines on main roads in South Africa inspired heated public debate. Yet the debate has focused more on the fees levied than on the near-field technologies employed – the same basic technologies behind subcutaneous chips.
Of course, the use of a basic technology in one field should not necessarily count against it in another. Neither, though, should it give that technology a free pass when it comes to its adoption in more invasive ways. Each case must be taken on its merits, or lack thereof.
On one level, implants are old news. Several groups overseas have experimented with implanted radio frequency identification (RFID) chips.
Most recently, the Epicenter high-tech office block in Stockholm, Sweden, has encouraged its workers to undergo chip implantation, which allows them to open security doors, run office machinery and even pay for their lunch at the work canteen.
Supporters of a more widespread adoption of biochip payments will argue that they represent a simple and logical extension of the current trend toward contactless payment cards.
These are becoming mainstream in countries like Australia, and the 2012 London Olympics was the first global event to run almost exclusively on wave-and-pay systems.
Yet while implanted chips may seem a convenient way to perform all manner of everyday activities, if we dig a little deeper we see that the promise may exceed the reality.
Hackable and Trackable
Firstly, we should consider the very real potential chip payments represent for bio-hacking. Any programmable device is, in theory, subject to hacking. A biochip can be invaded by third parties with nefarious motives in the same way that a computer system can.
With biochips, the potential for privacy incursions is huge – and not only with regard to society’s criminal elements. Recent debates about the work of official security agencies have highlighted public concerns about spying by governments on their citizens.
The trust contract between government and citizenry is a cornerstone of liberal democracy. Once this compact is broken, anarchy becomes a very real possibility. Current misgivings about privacy intrusions by officials are hardly likely to be allayed if we insert tracking devices into our bodies.
On the business and civic fronts, Big Data Analytics is proving a great boon.
Sophisticated digital devices such as smartphones, satnavs and CCTV units make it possible for us to collect and generate data at unprecedented rates. Every day, the global community adds 2.5 billion gigabytes to the database we call the internet.
Supercomputers, like IBM’s famous Watson machine, allow the speedy analysis of all this information and the discovery of patterns within it.
This analysis is used to predict such things as economic shifts, marketing trends and even political voter patterns. Big Data is now proving invaluable in the design of furniture, buildings, streets, driverless cars and even entire cities. Civic authorities are consulting it in the development of new crime prevention programmes and prisons.
For all its benefits, however, Big Data remains a form of soft surveillance.
The Samsung company recently warned users of its Smart TVs that the inbuilt voice recognition feature allows private conversations to be recorded and stored in the Cloud. These private conversations are then accessible to third parties. George Orwell would have loved that.
This is perhaps an isolated incident, and Big Data provides too many benefits for us to attempt to turn back the clock. However, the hacking of biochips would leave personal privacy even more exposed.
Implants may appear convenient, but we must consider whether or not we want our bodies to become hackable devices.
Internally ‘worn’ chips also raise other ethical considerations.
One of the most pertinent relates to the line between humanity and technology. As we use nanorobotics and bio-mechanical chips to introduce a new breed of prosthetic devices into the human frame, will we lose our sense of differentiation between what is human and what is machine? At what point might we truly become androids?
Far from being frivolous questions for sci-fi aficionados, these are now subjects undergoing serious debate in major universities – particularly in fast-growing ethics faculties. (What was sci-fi yesterday becomes wi-fi tomorrow.)
Commercially oriented chip implants also raise questions relating to digital debt. The growing number of charities and social enterprises devoted to helping the indebted bears witness to what is already a rapidly spreading problem in modern societies.
The uncoupling of spending from physical cash has doubtless played a key role in boosting personal debt.
Paper money and coinage have substance and weight; it is relatively easy to keep track of how much we spend when our money has a physical presence. We know it’s time to wind back on impulse purchases when the wad of cash in our pockets starts to feel a little on the light side.
Credit cards do not change weight when money leaves our accounts. At least, though, the process of filling out a credit card slip – now less and less a part of purchasing, thanks to wave-and-pay – provides some kind of physical reminder, albeit a tenuous one, that purchases cost us something of real-world value.
The advent of digital currencies such as bitcoin creates a potential for even greater overspending. The ones and zeroes of binary code have no weight at all. Implanted chips will continue to erode the link in human consciousness between spending and real-world value.
Arguably, payment companies like Visa have little interest in this problem. There are real benefits for them in divorcing the act of spending from any process of consumer forethought.
Subcutaneous spending devices also raise the potential for digital dementia. In 2011, an international study concluded, after ten years of investigation, that cognitive decline associated with dementia can begin at around the age of 45, rather than 65 as was previously believed.
At the 2020Plus think tank, we posed an important question linked to this study. If a similar ten-year scientific investigation commenced today, would we find at its conclusion that things we associate with dementia in 2015 had now become normal cognitive function?
Would loss of short-term memory, numeracy skills and feelings of confusion have ceased to be peculiar because we had ceded so many areas of our thinking to machines?
We already rely on gadgets for arithmetic, spelling, navigation and, increasingly, person-to-person interaction. What happens to the parts of our brains responsible for these and other activities if they are no longer called upon on a regular basis?
A few weeks ago, a leading British psychiatrist suggested that children as young as five years of age are exhibiting borderline autism-like symptoms. They are, he said, unable to read the subtle facial signals in normal human conversation because of their engagement with digital screens.
A range of studies, particularly in the USA, suggests that we are forming transactional relationships with machines. We do not remember what we learn on the internet as much as we remember where we found it, relying on the machine to store the details.
This of course means that what we read is not stored in long-term human memory, leaving less raw material from which to produce future innovation.
The experimental research of leading neuroscientists such as Baroness Susan Greenfield is building the case for watchfulness when it comes to relying too much on digital devices.
Implants also raise important health issues. Research into the impact of chips on the development of certain cancers is still ongoing. To this point, studies have only been carried out on laboratory animals. Yet even now, as The Australian newspaper reported recently, they point to links between chip implants and cancerous growths.
Business and Biochips
On the business front, chips offer attractive opportunities for lowering transaction fees and for sales growth, as people find waving a hand over a transaction pad even easier than waving an RFID credit card.
However, here too there are important questions to be answered – and more than a few potential pitfalls for the business community.
There are, for example, questions of ownership. Who would own the chip and the data stored on it or generated by it?
This is a vexed issue which the European Court of Justice has already attempted to address in its so-called "right to be forgotten" ruling. However, as this ruling relies on the cooperation of data companies like Google, it is hardly an iron-clad answer to the ownership question.
This question will be tested and retested many times in courts over the next few years, both locally and internationally. It will undoubtedly produce a rich harvest for corporate lawyers, accompanied by vast payouts from corporations.
Chip malfunction also presents a potential minefield for both regulatory bodies and corporate groups. Replacing a credit card is a relatively painless, if sometimes frustrating, experience for the consumer.
With subcutaneous chips, replacing a payment device becomes a medical procedure. Even the simplest of these carries a risk of infection or malpractice. Insurance costs to companies would therefore be considerable.
Providing adequate security for personal data would also present unique challenges. The theft or leaking of personal data downloaded from connected biochips would represent huge brand and legal challenges for even the largest company.
The dangers presented by ID theft will also rise, particularly online – unless, that is, the payment biochip also functions as a person's sole means of access to the internet, operating as a form of password for third-party devices. In that event, however, chip theft would become an even bigger threat.
Meanwhile, the use of payment biochips would further depersonalise the relationship between business and consumer.
Already, in the wake of the digital revolution, professional marketing has recognised the need for a fresh approach to consumer relations. This new approach is based on developing personal conversations with consumers, as opposed to pitching products at carefully studied demographics.
In times of uncertainty and rapid change, people want to feel that they have a personal connection with companies, especially those they engage with on a regular basis. They want to know that they have a friend at the bank and not simply an account manager.
Consumers look for companies that will, through such things as social media, engage them in ongoing conversations that grow over time. As the conversation grows, customers expect to be offered new opportunities, or premium deals, in the same way that online gamers expect to move to higher levels of play over time.
The use of subcutaneous payment chips would undoubtedly leave many consumers feeling that, far from being recognised as individuals, they are being depersonalised as the human frame itself is commodified.
A common perception would be that their very bodies have become sources of information for sale, the ultimate form of bio-currency.
For the corporate world, another by-product of this would be a loss of kudos for investments made in corporate social responsibility (CSR).
Any public goodwill achieved would potentially be cancelled out by a perception that Big Data has truly become Big Brother.
A sense of goodwill within companies might also suffer. Arguably, this is already a problem for those companies that propose a more extensive use of biometric passwords in the workplace.
Protecting against corporate espionage or the leakage of sensitive customer information would become a huge cost factor, both in financial and psychological terms. Already, 70 percent of U.S. companies use deep packet inspection (DPI) technology to monitor outgoing emails from their employees.
This is the same technology the NSA and other security agencies employ in tracking private email accounts in the search for key words related to terror activities. Thirty percent of U.S. companies use monitoring software to track the individual keystrokes made by their employees.
Some companies argue that this is a sensible practice; it offers protection from costly lawsuits and the loss of competitive advantage. However, it is hardly likely to engender a sense of trust between the employer and the employee.
This trust, not the dollar value of the company, is the core currency of any business. Without it, employees lack the confidence to be innovative and the motivation to be productive, and customers lack incentives to invest time and money.
The wealth gap between society's richest and poorest members would also be exacerbated by the introduction of payment biochips. Presumably, the expense of installing and maintaining them would mean, at least in the beginning, that only the most financially secure would be eligible.
Arguably, this would deny important lifestyle opportunities to those who fall below that line of acceptability. This would be especially true if, as seems likely, the role of biochips were to be extended beyond their use as payment devices to that of replacement passports and so on.
Already, campaign groups such as Stop the Cyborgs are calling for government regulation of the technologies behind Google Glass and similar augmented reality tools. They argue, not unconvincingly, that the opportunities afforded by these devices - across so many areas of personal and professional life - create such extra value for their owners that those who cannot afford them will suffer significantly in comparison.
Technology Creep
Payment biochips may also create new forms of “technology creep”, where the use of tools by government or business gradually exceeds the boundaries originally set by regulation - and accepted by the public.
This has already occurred in the UK with CCTV cameras. London is well known for having the highest number of cameras per capita in Europe.
When introduced, cameras were presented as a way to reduce the levels of violent crime in the capital. However, over time their use has grown to include photographing people as opposed to cars.
They are now also used to record the number plates of parents who briefly double park outside their children's schools during the afternoon school run.
The cost of technology creep in the above instance is nothing compared with the likely cost to public privacy and security if the use of payment biochips were extended to cover wider activities.
Overall, introducing payment biochips would also require highly sophisticated, inbuilt citizen/consumer controls. These may well prove to be vastly more difficult and costly to achieve than any other system introduced to date. They would open huge ethical and legal minefields for corporate groups and for regulators.
Technology is to be celebrated. There is no point in taking a Luddite approach. Digital technologies have brought, and will continue to bring, enormous benefits to the human experience.
That fact should not, however, make us oblivious to the potential pitfalls associated with making devices an extension – or an integral part – of the human frame.