Guest post by Alex Hoffman (@alexbhoffman) of sidewinder.fm, a music and tech think tank.
Smartphones have changed nearly every aspect of our lives. They have become the personal assistant and digital companion that keeps us punctual, social, and sane. Still, these devices don’t quite shine on their own: we have to push them, manually launching apps and inputting information.
Pandora demands an artist, Urbanspoon requires a cuisine, and Maps wants a destination.
These minor but collectively time-consuming requirements signal that our devices have yet to integrate our physical settings with our personal interests.
For instance, when we’re at home on the couch, we’re typically in the mood for relaxing things — be it a good TV show or book. And when we’re working out at the gym, we often seek out motivational music. These contexts are easy to generalize, but extremely hard to personalize. The right show, book, or playlist can vary infinitely depending on our taste, personality, and culture.
Many of today’s smartphones, though, house over ten sensors that gather all kinds of information about us. The accelerometer measures our speed, the GPS our location, and the radio our connections. So too, a microphone hears our sounds, the camera sees our sights, and a gyroscope feels our motion.
By coupling this sensory information with third-party data, our devices are beginning to usher in a new culture driven by context, wherein experiences and recommendations can be automatically catered to us. Instead of manually tapping to set our Android’s alarm at bedtime, it will be able to infer from our Google Calendar appointments and Google Maps traffic data just how long we need to commute to make our first meeting on time, and wake us accordingly.
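The alarm example boils down to simple arithmetic: work backward from the first meeting, subtracting the predicted commute and some time to get ready. Here is a minimal sketch of that logic; the function name, the 45-minute getting-ready buffer, and the sample times are hypothetical, and a real system would pull the meeting time from a calendar API and the commute estimate from a traffic service.

```python
from datetime import datetime, timedelta

def infer_wake_time(first_meeting: datetime,
                    commute: timedelta,
                    prep_buffer: timedelta = timedelta(minutes=45)) -> datetime:
    """Work backward from the first meeting: leave time for the
    predicted commute plus a fixed buffer for getting ready."""
    return first_meeting - commute - prep_buffer

# Hypothetical figures: a 9:00 meeting with a 35-minute commute.
meeting = datetime(2013, 6, 3, 9, 0)
alarm = infer_wake_time(meeting, commute=timedelta(minutes=35))
print(alarm.strftime("%H:%M"))  # → 07:40
```

The interesting part, of course, isn't the subtraction but the context: the commute estimate would be refreshed overnight as traffic conditions change, so the alarm adapts without any tapping.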
Continue reading the rest of the story on Hypebot