Issues Magazine

Barriers to Technology Development

By Greg Adamson

For a technology to succeed, it has to prove its value, but many valuable technologies that could help society still fail.

According to the concept of the singularity developed by science fiction writer and mathematician Vernor Vinge, in the future society will reach a moment when the immense accumulation of advanced technology totally transforms humanity. Are we on track for that? Are we approaching a world of computers that can be controlled by our thoughts and deliver our dreams? Are we even moving towards the effective use of technology to solve the world’s problems? Or are more advanced computers simply drowning in a sea of functions?

The evidence is not comforting. Even when we have the know-how, we often fail to deliver effective technology. Here are two simple examples where we have the technical knowledge to solve a problem but the problem persists:

  • shelter: humans discovered how to build shelters to protect themselves from the elements tens or hundreds of thousands of years ago, yet in most major cities in the world today there are homeless people who lack shelter; and
  • safe drinking water and sanitation: the key technical discoveries to provide safe drinking water and sanitation were known by the early 20th century, yet today according to the United Nations 2.6 billion people lack these benefits.

Singularity is about the future, and predicting the future is notoriously hazardous. What we do know now is that many technologies that look promising, and would appear to benefit the world as a whole, either fail or are strictly limited in their results. What stops great technology? It can be something as simple as confusion or as complex as human motivation.
Consider five causes of failure: prohibition, intolerance, secrecy, greed and confusion.

Prohibition

If creativity is an essential “plus” in developing technology that contributes to the needs of the world, an explicit prohibition on developing a particular technology, especially one backed by the imprisonment of technologists, is a pretty strong disincentive.

For instance, Russian computer programmer and PhD student Dmitri Sklyarov was imprisoned in the US after addressing a software conference in Las Vegas in 2001. He had presented a paper identifying weaknesses in the e-book protection software of a major IT company. While that company had originally filed a complaint against him with the FBI, it later stepped back in the face of widespread public criticism. The FBI case was rejected by a US jury the following year.

The case involved a controversial piece of legislation, the US Digital Millennium Copyright Act (DMCA). This legislation makes it illegal to research, criticise or publicly discuss the methods used by media companies to limit the use (including copying) of music, video and other “intellectual property”. The effect of prohibiting such research and discussion is to encourage the development of poor security technology, based on the hope that its weaknesses won’t become widely known.

Some security technologies are very poor indeed. For example, one widely adopted security system to prevent music CDs from being played on computers could be overcome simply by drawing a line around the edge of the CD with a felt pen: the computer could then no longer read the do-not-play instructions, which were stored near the rim of the disc.

Instead of working on more reliable security, or better still adopting a system that didn’t consider the end user a hostile party, DMCA encouraged legal action against anyone who mentioned felt pens and CDs, or equivalent approaches to other security solutions. In the security community, this approach is known as “security through obscurity”. It is generally frowned upon, because it creates a false sense of security that can be easily shattered. Widely adopted security systems today will generally be ones that have been tested by the security community without being broken.

Intolerance

Engineers are human, and have personal lives like everyone else. Great engineering requires great creativity, and a society that suppresses creativity is limiting the capacity of its engineers to deliver great engineering.

One of the technical greats of the 20th century was Alan Turing. Turing’s name today is preserved in some well-known ways: one is the Association for Computing Machinery’s Turing Award, which has been described as “the Nobel Prize of computing”. Another is the Turing test, a measure of computer intelligence based on whether a computer can fool a human into believing they are communicating with another human.

Alan Turing was a brilliant mathematician who made many fundamental contributions to the computing field. He was also key to the British code-breaking effort directed against the German Enigma machine during World War II. After the war, Turing’s defence work was classified, although he was still a renowned computing innovator. He was also homosexual, a crime in Britain at that time. In 2009, 64 years after the end of the war in Europe, British Prime Minister Gordon Brown issued a public apology to Turing:

Turing was a quite brilliant mathematician, most famous for his work on breaking the German Enigma codes. It is no exaggeration to say that, without his outstanding contribution, the history of World War Two could well have been very different… The debt of gratitude he is owed makes it all the more horrifying, therefore, that he was treated so inhumanely. In 1952, he was convicted of “gross indecency” – in effect, tried for being gay. His sentence – and he was faced with the miserable choice of this or prison – was chemical castration by a series of injections of female hormones. He took his own life just two years later…

It is thanks to men and women who were totally committed to fighting fascism, people like Alan Turing, that the horrors of the Holocaust and of total war are part of Europe’s history and not Europe’s present. So on behalf of the British government, and all those who live freely thanks to Alan’s work I am very proud to say: we’re sorry, you deserved so much better.

Secrecy

Research can be undertaken and be successful, but the discovery can then disappear. In some cases a technology patent can be purchased by a company that controls an alternative technology and wants to avoid competition. Norbert Wiener, the inventor of cybernetics (the “cyber” in cyberspace), described this situation in relation to a patent he sold to a large telecommunications company in the 1920s.

In a separate situation, a discovery might be deemed to have military value and disappear from the public eye. A widely discussed case of this type relates to “spread spectrum” technology, a form of data communication. Data communication is the method by which “information” is communicated via a “channel”. For example, the information in a telephone call is your voice, and the channel could be either a landline or a wireless telephone connection. For radio, the information is the radio program and the channel is the FM or AM frequency of the electromagnetic spectrum used by a particular radio station. The general rule of thumb is that, for any particular channel, the information signal needs to be stronger than any unwanted noise. Spread spectrum is a very clever way of sending a signal that can’t be distinguished from the background noise unless you know what you are looking for.
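The idea can be sketched in a few lines of Python. This is a toy direct-sequence scheme, not Lamarr and Antheil’s frequency-hopping design: each data bit is multiplied by a long pseudo-random “chip” sequence known to both ends, and the receiver correlates the received samples against the same sequence, so the bit is recovered even when noise swamps every individual sample.

```python
import random

def make_chips(n, seed=42):
    """Pseudo-random +/-1 chip sequence shared by sender and receiver."""
    rng = random.Random(seed)
    return [rng.choice((-1, 1)) for _ in range(n)]

def spread(bit, chips):
    """Spread one data bit (+1 or -1) across the whole chip sequence."""
    return [bit * c for c in chips]

def despread(samples, chips):
    """Correlate received samples with the known chips; the sign of the
    correlation recovers the bit, because the noise terms average out
    while the signal terms all add in the same direction."""
    correlation = sum(s * c for s, c in zip(samples, chips))
    return 1 if correlation >= 0 else -1

chips = make_chips(1000)
noise = random.Random(7)
# Noise is five times stronger than the per-chip signal amplitude,
# so no single sample reveals the bit.
received = [s + noise.gauss(0, 5) for s in spread(-1, chips)]
print(despread(received, chips))  # recovers -1
```

Without the chip sequence an eavesdropper sees only noise-like samples; with it, a thousand weak samples add up to an unmistakable signal.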

A particular form of spread spectrum was developed and patented by actor Hedy Lamarr and pianist George Antheil in 1942. In military tests during World War II, this method was highly effective in noisy environments, and the invention disappeared under the cloak of military secrecy.

More than three decades later, spread spectrum technology once again became the subject of public development, and delivered a breakthrough in wireless technology. Many important technologies from mobile phones to GPS incorporate spread spectrum technology now that the technology has been allowed to develop.

Greed

In other cases, greed is able to overwhelm the beneficial purpose of a technology. Leading business theorist Michael Porter has developed a theory of “shared value”. For him, the need for health, better housing, improved nutrition, help for the ageing, greater financial security and less environmental damage represent the largest unfulfilled set of demands in the global economy today. Unfortunately, it may be more profitable to do nothing rather than to work to meet these needs.

An extreme example is the “patent troll”, a company that buys large numbers of technology patents cheaply and then looks for opportunities to sue other companies for breach of these patents. The stated aim of the patent system is to reward creativity by providing a monopoly on a new invention for some period. In practice a patent may be nothing more than a “ticket to litigation”.

For more than 100 years technology has been so complicated that almost any new invention can be subject to claims of patent infringement. Major technology companies today hold portfolios of defensive patents to use in negotiation with other companies as they develop their products. The patent troll, however, is not vulnerable to such counter-threats because it produces nothing itself. The effect is to entangle technology development while providing nothing of value.

Confusion

Finally, I will turn to the experience of engineering itself. One tends to think of engineering as an exact profession, and this is one of the behaviours that an engineering undergraduate learns. We are expected to be both precise and accurate, and the difference between precision (how many decimal places in an answer) and accuracy (how close the answer is to the desired result) is drilled into us. We design bridges, build life-critical medical equipment, create computer systems with high requirements for accuracy and analyse situations to work out how to address critical problems.
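That distinction can be made concrete in a couple of lines, with pi standing in for the desired result: a number can carry many decimal places (precision) and still be far from the truth (accuracy), and vice versa.

```python
import math

true_value = math.pi

precise_but_inaccurate = 3.000000  # six decimal places, but far from pi
accurate_but_imprecise = 3.14      # only two decimals, yet close to pi

# The shorter number is the more accurate one.
print(abs(true_value - accurate_but_imprecise) <
      abs(true_value - precise_but_inaccurate))  # True
```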

Yet engineering is not infallible, even in extremely expensive and well-planned activities such as space exploration. On 11 December 1998 the Mars Climate Orbiter was launched to investigate Mars. Its complex systems all functioned normally until what the NASA report describes as “an abrupt loss of mission shortly after the start of the Mars Orbit Insertion burn on September 23, 1999”.

On that date the mission was lost completely. What was meant to happen was the commencement of orbit around Mars. In the previous week the calculated distance from the surface had reduced to around 160 km, and on the day of the disaster it was calculated at 110 km; the minimum distance considered survivable was 80 km. The technical detail of the NASA report is worth reading at this point; the critical point comes at the end:

The MOI [Mars Orbital Insertion] engine start occurred at 09:00:46 (UTC) on September 23, 1999. All systems performed nominally until Mars’s occultation loss of signal at 09:04:52 (UTC), which occurred 49 seconds earlier than predicted. Signal was not reacquired following the 21 minute predicted occultation interval. ... On September 29, 1999, it was discovered that the small forces DV’s [velocity change] reported by the spacecraft engineers for use in orbit determination solutions was low by a factor of 4.45 (1 pound force = 4.45 Newtons) because the impulse bit data contained in the AMD file was delivered in lb-sec instead of the specified and expected units of Newton-sec.

Rocket science is universally recognised as difficult (usually in the negative, as in “it isn’t rocket science”). Yet in this case the teams involved in the project didn’t realise that they were using two incompatible measuring systems, metric and imperial, and a nine-month journey came to a crashing halt.
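The failure mode itself is mundane: one piece of software writes an impulse in pound-force-seconds, another reads the same number as newton-seconds, and every figure is silently understated by the conversion factor the NASA report identifies. A minimal sketch (the impulse value here is illustrative, not mission data):

```python
LBF_TO_NEWTON = 4.448222  # 1 pound-force in newtons

def lbf_sec_to_newton_sec(impulse_lbf_s):
    """Convert an impulse from pound-force-seconds to newton-seconds."""
    return impulse_lbf_s * LBF_TO_NEWTON

reported = 10.0                            # written to the file in lbf-s
assumed = reported                         # misread as N-s: no conversion applied
actual = lbf_sec_to_newton_sec(reported)   # what the number really meant

print(round(actual / assumed, 2))  # → 4.45, the factor named in the report
```

No individual line of either program is wrong; the error lives entirely in the unstated assumption about units at the interface between them.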


These five examples are by no means all the barriers that block the development of socially beneficial technology, but they give a flavour of the challenges that any technology-based social improvement will face.

In The Peace War, Vernor Vinge describes a concerted effort by technologists and others to break through the stultifying barriers of a ruling military caste. In the real world today, those interested in the contribution that technology can make to address world problems also need to think about barriers. That isn’t usually top-of-mind for engineers.

Yet the history of technology benefiting humanity is a history of technologists working to overcome the barriers they faced. The supporters who helped Dmitri Sklyarov, the many who campaigned for justice for Alan Turing, the historians who revived Lamarr and Antheil’s memory, advocates of technology for social benefit rather than as a source of non-productive profit, engineers who work towards technical standards – these examples can guide and inspire us today.