
Technology Delusions in International Development

Submitted on 25 Nov 2013 – 16:50

By Kentaro Toyama, Researcher, UC Berkeley, School of Information

In 2004, I moved to Bangalore, India, to found a new research effort for Microsoft. Our goal was to explore how electronic technologies could contribute to the socio-economic growth of the world’s poorest communities. We spent months in remote rural villages and urban slums, and we immersed ourselves in the lives of rice farmers and domestic servants.

At the time, the international development community was excited about applying personal computers and the Internet to address challenges in smallholder agriculture, rural healthcare, and public education. A colleague at the renowned Indian Institute of Technology aimed to double rural incomes through “telecenters” – Internet cafés with a social mission. Internationally, Kofi Annan, then the Secretary-General of the United Nations, hailed the arrival of the low-cost “One Laptop Per Child”: “These robust and versatile machines will enable kids to become more active in their own learning.”

I am a computer scientist by training, and nothing would have pleased me more than to proclaim technology’s ability to transform global poverty. Yet, after six years of research in information and communication technologies (ICT) for international development, I have come to an altogether different conclusion: as much as technology can support development efforts, it rarely does so in practice. A good understanding of this failure is critical both to the optimal use of technology and to the larger question of development priorities.

A Brief History of ICT in Development

“What if the full power and vividness of [technology X] were to be used to help the schools develop a country’s new educational pattern? What if the full persuasive and instructional power of [technology X] were to be used in support of community development and the modernization of farming?”

These questions were asked by Wilbur Schramm – the father of communication studies – in a book jointly published by UNESCO and Stanford University in 1964. For Schramm of course, “technology X” was the television.

In hindsight, it seems quaint, even slightly absurd, to have expected television to do much for development. A half-century of television has shown us that whatever the technology’s potential, what dominates the airwaves are sitcoms and reality TV. Public programming is constrained by budgets and drowned out by commercial television; and commercial television is a race to the bottom for the mass market. Meanwhile, even deliberate attempts to use TV for productive purposes only occasionally achieve their goals. Schramm himself investigated an attempt in the Samoan Islands to use television as the basis of their educational system and found it deeply flawed. Within a couple of years of the programme’s inception, teachers, parents, and even students clamored for change – they knew they weren’t learning much.

Today, few people suggest that television is a significant force for international development, but we have new technologies and therefore new hopes. NGOs were excited about personal computers in the mid-1990s. In the early 2000s, it was telecenters. Today, the mantle has passed to mobile phones and social media. It’s impossible to walk the halls of the WHO without hearing about mHealth, and just last month, Facebook founder Mark Zuckerberg announced an effort to increase Internet availability in the developing world for the sake of “driving humanity forward.”

Each new technology offers new promise. Television surpasses radio with moving imagery; computers allow interactivity; mobile phones offer two-way communication; and social media raises the bar with multi-way co-ordination. One day, we may have direct brain-to-brain quantum transmitters, but whatever the technology, the hard challenges of development will remain just as they always have been.

The Limits of Technology

In my research group in India, one of our projects sought to help the women of a south Bangalore slum. Their primary source of income was informal domestic work: They cleaned toilets, mopped floors, and cooked meals for urban middle-class families. They found jobs by word of mouth, and they were always looking for more work.

The situation seemed ripe for a technology solution. Potential employers had access to the Internet at home or at work, and a local community center had a couple of underused computers. We decided to build a job-search kiosk in the spirit of LinkedIn.

We focused our efforts on the user interface. We drafted and re-drafted graphics that made sense to non-literate users. We produced short video clips that explained how the kiosks worked. And after several months of prototyping and testing, we had a system that the women could navigate to find jobs. The technology worked.

But then, the real challenges came. In order to build up a database of jobs and workers, we had to go door to door, signing up women interested in work and canvassing apartment complexes for households seeking domestic labor. Potential employers had idiosyncratic expectations. Some mentioned caste-based constraints. Others sought specific cooking skills. Still others didn’t want workers to use their toilets. Meanwhile, many of the workers lacked standards of basic professionalism: They couldn’t commit to specific times of day. They’d miss appointments without notice. They sometimes requested advance payment for personal reasons.

To address these problems, we drafted standardized contracts. We trained workers in cooking, cleaning, and work etiquette. We hired full-time staff to manage client and worker interactions. Pretty soon, the original job-search kiosk ceased to matter. It was the least of the components needed to make the whole system work. Our costs didn’t seem worth the outcomes, so after trying hard for two years, we shut down the effort. A website – no matter how easy to use – didn’t address the hard parts of the problem.

We were not the only ones in India trying to match informal workers with employers. There are similar projects across the country, a few of which I have since become familiar with. LabourNet sets up job-search and training centers; another organization uses mobile phones and informal social networks to identify clients. What separates the successes from the failures, however, has little to do with the technology. Instead, it has everything to do with how well the organizations behind them perform basic tasks – effective management, aggressive recruiting, choice of clientele, and the ability to sign up employers. Technology can help a strong organization do its work better, and it can help capable workers find more work; but it is not itself the cause of efficient management or good employment.

Over the last decade, I have been involved in fifty-odd projects that applied digital technologies to social challenges in education, agriculture, governance, healthcare, and finance. Throughout, the lesson has been consistent: Technology amplifies underlying human forces, but it only amplifies them. Where human forces are capable and positively inclined, technology can further improve outcomes. Where human forces are corrupt or dysfunctional, technology – no matter how well-designed – doesn’t have the desired outcome. In the end, there is a human finger on the “on” switch, and a human hand at the controls.

Ironically for technology proponents, this means that exactly those situations most in need of help are the ones where technology itself isn’t a solution.

The Myth of ICT

By thinking of technology as an amplifier of human forces, and not as a net addition to them, we can effectively predict where ICT-centric approaches to solving problems will fall short. For example…

  • If a healthcare system is too weak to deliver vaccines to rural households, then SMS helplines or mobile apps for healthcare workers will do little to increase vaccination. Vaccines are among the simplest of healthcare interventions; a system that can’t deliver them successfully is hardly likely to make effective use of digital gadgets to improve its performance.
  • If a country has undertrained teachers or under-motivated school administrators, then neither laptops nor electronic textbooks will improve educational quality. Without adult guidance, children with electronic gadgets end up playing video games. Technology can support learning, but it cannot replace high-quality human supervision.
  • If a society is torn by political strife, or if its policies and practices do not address systemic problems like inequality, then technology penetration by itself will not alter the socio-economic landscape. The United States, for example, has experienced a golden age of ICT innovation since the 1970s, having given birth to the personal computer, the Internet, the mobile phone, Microsoft, Google, and Facebook. Yet, during that same span of time, the country’s rate of poverty has not decreased at all, and inequality has skyrocketed. Meanwhile, America’s two political parties have rarely been so polarized. Technology by itself doesn’t alleviate poverty or political conflict.

In each case, the social or institutional substrate for meaningful impact is missing, and so technology has no positive force to amplify.

A Healthy Approach to Technology

Despite its inability to cause socio-economic changes on its own, technology remains a powerful tool. What are the best ways to put it to positive use?

First, it’s important to distinguish between the production of technology and the consumption of technology. Technology production can contribute to a country’s economic growth. Technology consumption by itself does not. People cannot consume their way out of poverty. Apple is a successful company not because it uses technology (though it does), but because it produces technology that others want. It is Apple shareholders and employees who get rich from iPhones and iPads, not users. (Of course, for technology production, there must be enough of a population with the skills required to produce the technology – developing those skills again requires strong educational institutions.)

Second, for anyone constrained to use ICT as part of a development programme, the best approach is to collaborate with entities already having the relevant kind of impact. For example, if improving rural healthcare is the goal, look for organizations that are already having healthcare impact in remote villages. Then, seek to amplify their impact using technology. Whether they are non-profits, for-profits, or governments, it’s exactly the organizations with high capacity and commitment that are able to capitalize on technology to further their mission. (And, if you represent such an organization yourself, seek technologists eager to support your work, rather than people intent on imposing new technologies for their own sake.)

Last and perhaps most important – social, political, economic, and cultural issues must remain priorities even in an age of technology. Laws must be enacted; institutions must be well-funded and well-managed; staff must be well-trained for their roles; dialogue must occur in local communities and in the public sphere. All of this takes time, expertise, and resources, but there are no technology shortcuts. No one expects a failing for-profit company to become successful with the injection of new smartphones, social media tools, or a state-of-the-art datacenter. Why then, should we expect such simplistic solutions to address the more complex problems facing governments and nations?