Steve Jobs first announced the original iPhone at the Macworld convention in January 2007, and the rest, as they say, is history.
He famously said during the keynote:
“We wanna reinvent the phone.
So, what’s the killer app?
The killer app is making calls!”
However, there was a certain irony in the fact that this one feature (calling), which every phone before the iPhone had managed quite easily, almost led to the downfall of this revolutionary product. It wasn't the phone's ability to make calls that was the issue; rather, the new, all-glass touchscreen design of the iPhone created a problem that had not previously existed: how to stop a customer's cheek from inadvertently activating the touchscreen, and potentially even ending a call.
Most phones before the iPhone featured a small, non-touch screen at the top and a plastic keyboard at the bottom (Jobs was known to dislike this design, and that dislike shaped his thinking on the iPhone).
Just months before the iPhone was due to go on sale to the general public, it was fraught with problems ranging from buggy software and faulty antennas to a battery that couldn't hold a charge. Those problems could all be addressed, but it was said that the issue of a customer's cheek ending phone calls could be the real deal breaker.
The iPhone was well into development before the proximity sensor was conceptualised, and it was nowhere near ready for release. All the other bugs could be resolved relatively easily, but there was still no solution to this problem, a problem that, if not fixed, could have delayed or even cancelled the iPhone.
How does the proximity sensor work on the iPhone?
The proximity sensor in the original iPhone works in fundamentally the same way as the sensor in all modern iPhones: it detects when the phone is lifted to your ear and turns off the display, then turns it back on when the call is over. It works by emitting infrared radiation; if the radiation hits an object (i.e. your cheek) and bounces back, it is detected by a receiver. There is more to it than that, though: a small burst of radiation reflecting off a nearby object returns far more intensely than one reflecting off an object further away, so the sensitivity had to be set just right to ensure the sensor worked correctly.
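In code terms, the core decision is a simple one: compare the reflected-infrared reading against a calibrated threshold and blank or wake the display accordingly. The sketch below is purely illustrative; the type names, the 0.0 to 1.0 intensity scale and the threshold value are assumptions, not Apple's actual implementation.

```swift
// Hypothetical sketch of the basic idea, not Apple's real code.
// The sensor fires a short infrared pulse; the receiver reports how much
// of it bounced back. A strong reflection means something (a cheek) is close.

struct ProximitySensor {
    /// Reflected-IR intensity on an assumed 0.0...1.0 scale.
    func readReflectedIntensity() -> Double {
        // On a real device this value would come from the sensor hardware.
        return 0.0
    }
}

final class ProximityController {
    private let sensor = ProximitySensor()
    /// Threshold chosen during calibration: readings above it mean "near".
    private let nearThreshold = 0.6

    private(set) var displayIsOn = true

    func poll() {
        let intensity = sensor.readReflectedIntensity()
        if intensity > nearThreshold {
            displayIsOn = false   // phone is at the ear: blank the screen
        } else {
            displayIsOn = true    // phone has moved away: wake the screen
        }
    }
}
```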
It was a tricky issue to fix because there were so many variables, such as skin colour, hair length and hair colour, and the sensor had to work faultlessly for them all. The amount of radiation reflected is influenced by the colour and finish of the surface: light, shiny surfaces bounce back far more of it than darker ones, which absorb it. Shiny objects on clothing, for instance, could cause the proximity sensor to turn off the display even when the user wasn't on a call.
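One common way to cope with surfaces that reflect very strongly or very weakly is to use two thresholds (hysteresis) and to require several consecutive "near" readings before acting. The following is only a hypothetical sketch of that general technique; the threshold and count values are invented for illustration, and the article does not describe how Apple actually calibrated the sensor.

```swift
// Hypothetical illustration of hysteresis plus debouncing, not Apple's code.
// A single glint off a shiny surface won't blank the screen, and a weakly
// reflecting surface such as dark hair won't make the display flicker.

final class DebouncedProximityController {
    // Assumed calibration values on a 0.0...1.0 intensity scale.
    private let nearThreshold = 0.65   // must exceed this to count as "near"
    private let farThreshold = 0.35    // must drop below this to count as "far"
    private let requiredConsecutiveNear = 3

    private var consecutiveNear = 0
    private(set) var displayIsOn = true

    func process(reading: Double) {
        if reading > nearThreshold {
            consecutiveNear += 1
            if consecutiveNear >= requiredConsecutiveNear {
                displayIsOn = false    // sustained reflection: phone is at the ear
            }
        } else if reading < farThreshold {
            consecutiveNear = 0
            displayIsOn = true         // clearly far away: wake the screen
        }
        // Readings between the two thresholds leave the state unchanged.
    }
}
```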
As the iPhone’s release deadline drew nearer, Apple engineers worked day and night to find a solution. It was even said that an Apple engineer with ‘extremely dark hair’ was asked to donate some of it so that it could be put into a test device to calibrate the sensor and ensure it worked as expected.
Nonetheless, the Apple engineers and design team worked around the clock and got the iPhone proximity sensor to function much as it does now: the display goes off when the phone is at your ear and comes back on when your call is done.
The proximity sensor was such a notable feature at the time that it was listed as a major feature on the iPhone’s product page on Apple’s website.
The technology has stuck around ever since, though not without a few issues; most notably, the iPhone 4 drew numerous reports and criticism that its proximity sensor didn’t function reliably.
Nonetheless, the technology has lived on. Were it not for the engineers at Apple coming up with a solution to a brand-new problem, the iPhone might never have been.