It’s fairly safe to say Apple’s ditching of Google Maps for their own navigation system has proved not to be the company’s smartest move.
The humiliation of Apple was complete when the Victoria Police issued a warning against using the iPhone map application after people became lost in the desert when following faulty directions to the town of Mildura.
Mapping is a complex task and it’s not surprising these mistakes happen, particularly given the dynamic nature of road conditions and closures. It’s why GPS and mapping systems incorporate millions of hours of input into the databases underlying these services.
Glitches with GPS navigation and mapping applications aren’t new. Some of the most notorious have been in the UK, where huge trucks have been directed down small country lanes only to find themselves stuck in medieval villages far from their intended destination.
While those mishaps make for good reading, there are real risks in these misdirections. One of the best-publicised tragedies of misreading maps was the death of James Kim in 2006.
Kim, a well-known US tech journalist, was driving with his family from Portland, Oregon to a hotel on the Pacific Coast in November 2006 when they tried to take a short cut across the mountains.
After several hours of driving, the family became lost and stuck in snowdrifts, and James died while hiking out to find help. His wife and two children were rescued after a week in the wilderness.
Remarkably, despite warnings of the risks, people still get stuck on that road. The local newspaper describes this annual ritual as ‘find a tourist in the snow season’.
Partly this irresponsibility is due to our modern inability to assess risk, but a deeper problem is blind faith in technology and the algorithms that decide what is good and bad.
A blind faith in algorithms is a risk to businesses as well – Facebook shuts down accounts that might be showing nipples, Google locks people out of their Places accounts, while PayPal freezes tens of thousands of dollars of merchants’ funds. All of this happens because their computers say there is a problem.
Far more sinister is the use of computer algorithms to determine who is a potential terrorist, as many people who’ve inadvertently found themselves on the US government’s No Fly List have discovered.
As massive volumes of information are gathered on individuals and businesses, it’s tempting for all of us to rely on computer programs to tell us what is relevant and to join the dots between various data points.
While the computer is often right, it is sometimes wrong, and that’s why proper supervision and a clear understanding of what the system is telling us are essential.
If we blindly accept what the computer tells us, we risk being stuck in our own deserts or a snowdrift as a result.
Paul Wallbank is one of Australia’s leading experts on how industries and societies are changing in this connected, globalised era. When he isn’t explaining technology issues, he helps businesses and community organisations find opportunities in the new economy.