Architectural Considerations for a World of Devices
by Atanu Banerjee
Summary: The number of mobile device users is rapidly increasing, but the promise of solutions that leverage networks of connected devices has remained largely unrealized. Some of the pieces needed to build rich, connected experiences were not available until recently. This article explores some of the economic, social, and technology trends that are driving the adoption of mobile devices; describes the different kinds of user experiences that are now becoming possible; and presents an overview of architectural concerns associated with such mobility solutions, at the levels of hardware, software, connectivity, and services capabilities.
After 10 years of hype, mobility solutions are finally taking off. You could ask, “Why now? What has changed? Are there any new opportunities to consider? Should I consider mobile devices in my solutions?” It turns out that economic, social, and technology trends are accelerating the move to devices. There is a broad spectrum of devices with different form factors, running different kinds of applications, as shown in Figure 1, and associated with different sets of trends.
Figure 1. Range of mobile devices, and the kinds of applications that run on them
An important driver of adoption for cellular phones has been emerging markets. For example, BusinessWeek reports that Nigeria had 500,000 telephone lines in 2001, but now has more than 30 million cell phone subscribers. It is currently estimated that there will be 5 billion cell phone users in the world by 2015.
Adoption of cell phones drives adoption of services for cell phones. In Asia, many services leverage high-end devices to deliver rich, interactive media. These require the higher-end smartphones and Pocket PCs shown in Figure 1. However, many people in emerging markets cannot afford such devices, so services that target lower-end volume handsets are also being rolled out. In Kenya, Safaricom rolled out an SMS-based service for mobile payments in March 2007, called M-Pesa, that has been widely adopted. Unsurprisingly, access to improved communications can also be hugely beneficial to local economies. Dr. Robert Jensen at Brown University conducted a study of Indian fishermen who started using mobile phones to find the best coastal marketplaces for their catch. While the fishermen saw profits increase 8 percent, consumer prices actually dropped by 4 percent because less fish was being wasted.
Today in Helsinki, Finland, 57 percent of public transport single tickets are paid by mobile phone. In Croatia, over half of all parking is paid by mobile phone. Twenty percent of London’s congestion charge is paid by mobile phone. (See Resources: Mobile Phones As Mass Media.)
There are now over twice as many mobile phones worldwide as there are personal computers. The wireless industry used the opening of its largest trade show in March to outline opportunities for a “three-screen” world (PC, TV, mobile), in which mobile devices become major avenues for TV shows, music, games, and advertising. For many younger consumers, it might even be argued that the mobile device is the most important of those three screens.
Accompanying the growth in devices, the evolution of the Internet is leading to new usage patterns. Today’s solutions are differentiated from older ones by their global reach and scale, which lead to new channels for user participation. For example, sporting goods company Nike sells the Nike+, a small sensor that fits into a runner’s shoes and tracks his progress on an iPod that the runner also carries. When the Apple iPod is connected to a PC, details of the runner’s runs get posted to the Nike+ Web site, a social networking site for runners, so that runners around the world can form groups to track each other’s routes and progress.
The Internet is also changing the way that content gets created. Blogging is making content publishing less impersonal by bringing readers closer to authors. Content is also becoming interactive and social with online gaming, chat, and the advent of communities around user-generated content. This trend will accelerate with the proliferation of mobile devices that make it easy to capture content, edit it directly on the device, attach context to it, and then upload it to storage either on a PC or in the cloud. In some P2P scenarios, it is also becoming possible to share content directly from the device itself.
This transformation of content pipelines makes it more likely that content creation will be triggered by external events rather than on fixed schedules. The handiness of devices makes it increasingly likely that a means of recording an event will be at hand when it happens, leading to the spontaneous creation of new content that might not have been captured otherwise. Not surprisingly, the amount of content being generated and stored has exploded. The upsurge in spontaneous citizen reporting has resulted, on more than one occasion, in footage from a mobile device leading to dramatic public reaction.
The first trend that is reshaping the industry is that of convergence. Today people use a wide variety of devices—smartphones, PDAs, laptops, personal media players, cameras, and camcorders. It is expected that these technologies will converge into more powerful, general-purpose personal computing devices that can be used for a wide variety of business and consumer-oriented tasks. Convergence in networks will mean seamless handling of both voice and data over the same protocols.
Convergence leads to the second trend: Devices are getting smarter. A new generation of smartphones is becoming increasingly aware of the user’s environment and local context, through sensors (such as GPS or an accelerometer) and better software on the device. This context might be used to tag content (for example, tagging a photograph with time/location metadata), to tailor application behavior (suppressing application alerts while the user is on a phone call, for example), or to control the user’s local environment (such as settings in a car self-adjusting based on the driver’s identity, retrieved from a device on the driver’s person). Networks will also extend to cover devices and agents distributed around the body over protocols such as Bluetooth, which is the idea behind Personal Area Networks (PANs). All this will lead to new architectures in which devices are much more than just information displays—they will become first-class application platforms in their own right. Not only that, but there will likely be some scenarios (such as PANs) where some devices act as servers for other devices (in client-server architectures), or as superpeers/index servers (in P2P architectures).
This is important because mobile device applications need to provide a user experience that is very different from that of a desktop. The key characteristic of mobile users is that they are engaged in some other primary activity. A device-based application should not force its users to make accommodations, but instead fit into people’s lives and lifestyles by being context-aware, nonobtrusive, and ready to provide value at short notice.
The third trend is the mobile Internet, a collection of Web sites and services specifically targeted at mobile devices and available over Internet protocols. Growth of the mobile Web will accelerate consumption of Internet-based applications and services from mobile devices, which today is constrained by device and mobile access plan limitations.
The combination of these three trends will result in a move toward pervasive computing. As devices proliferate and become smarter, more computing power will be embedded at the edges of the network. As devices become better at handling user context, they will become increasingly unobtrusive. As devices become better networked, and as the mobile Internet evolves, users will have available a rich set of services that can make use of this personal context. The borders between human environments and computing devices will gradually blur, and users will get the sense of being assisted by their immediate environments. Large numbers of embedded computing devices will force new solution architectures to handle emergent challenges around user experience, device management, security, content management, and so forth. Network access needs to become universally available. Although we are clearly not at the point of pervasive computing yet, we are moving in that direction with the growth of embedded devices and smartphones, the spread of wireless connectivity across our environments (from work places to living spaces to some cars), and the broad availability of Internet services to be accessed by devices.
As devices become more common, software will need to span a mesh of Web-connected devices and embrace the increasing pervasiveness of the Internet, a core pattern of Web 2.0 described by Tim O’Reilly as “software above the level of a single device.” Devices are used in multiple physical and virtual spaces (Figure 2, below).
Figure 2. User experiences in a world of connected devices span multiple physical and virtual spaces. Each of the blue dots represents a physical environment (a room, the home, workspace), a social environment (friends, family, colleagues), a virtual environment (profile pages, virtual world, online game) or a subscribed online service.
Experiences Related to Me
The devices in these scenarios are typically used for communications (phone, email), gathering of content (mobile search), consumption of content (personal media players), and for health monitoring (heart rate monitor). The application scope in these scenarios centers on information gathered about you, created for you, or consumed directly by you. Relevant information includes credentials (Live ID), contacts, messages, presence information, and personal content (audio, video). The types of devices that are important include cell phones, smartphones, PDAs, Ultra-Mobile PCs (UMPCs), laptops, and health monitors. The connectivity needed is for Bluetooth Personal Area Networks (PAN), Health sensors, and so on.
Experiences Related to My Local Environment
As described earlier, the proliferation of smart devices will lead to spaces where the boundaries between a person’s immediate environment and computing devices in that environment are blurred. This will be achieved through networked devices with built-in sensors (GPS, accelerometer, ambient light sensor), that understand user context, and are unobtrusive in their actions, so that users get the sense of being assisted by their environment.
An example of such an environment is the Microsoft Auto solution, which connects a user’s devices (such as mobile phones and portable media players) into a single in-car system that can be operated with the driver’s voice or buttons on the steering wheel. Ford Motor Company will roll out a solution called Ford Sync in 2008, which will enable next-generation mobile user experiences: for example, users entering a car while talking on a mobile phone can press a button on the steering wheel to have the phone connect to Sync without interrupting the call. Another case of extending an automotive environment with devices is that of OnStar, which provides security and roadside assistance: within the car, a communications device is connected to the radio, a GPS antenna, and a microphone via an on-board network (or bus).
Conference rooms are being extended with devices as well. Microsoft RoundTable is a combination video conferencing camera and microphone that uses sound and motion detection to automatically shift focus to the current speaker. Eliminating the need for speakers to move to face a fixed camera when they start to speak is in line with the idea of devices becoming less obtrusive in their actions.
In some cases, devices may need to share information with other devices also in a user’s vicinity, as well as with Internet-based services, raising issues around discovery, handshaking, shared understanding of the user’s identity and context, and so on.
Experiences Related to My Remote Environments
Remote environments are similar to local environments, in that they are spaces where devices gather information and take actions. However, remote environments are not in the immediate vicinity of the user of the device. In other words, scenarios and experiences that relate to a user’s remote environments allow that person to connect to, monitor, and work with devices at other locations. One reason to do this is to monitor, or even control, the environment at a remote location, for example as a safeguard against criminal activity. People might be interested in remotely monitoring their homes or workplaces (such as a data center), or even loved ones (children or elderly parents). A simple example of a device that does this is a baby monitor. Businesses might want to monitor remote locations; there are a host of logistics-related scenarios using RFID devices in this category as well (for example, to ensure the electronic pedigree of pharmaceuticals as they are transported through a supply chain, in order to eliminate counterfeit drugs).
Experiences Related to My Social Networks
Figure 3 shows that mobile devices fit into social networks in the same way personal computers do. People use their devices to search for and find people and their content, to coordinate with their friends and relatives, and to share content with others.
However, the reach and scale of devices is much broader than personal computers, and social interactions are more spontaneous (when the user is on the go, her camera phone is always at hand). Applications and services need to accommodate such scenarios, where the user’s attention span is sharply reduced.
Figure 3. Social networking on devices
Challenges in Delivering Rich Experiences on Devices
There are many differences between devices and personal computers, and it would be a mistake to consider devices as just smaller versions of PCs. In order to deliver rich experiences onto devices, solution architects need to consider a number of constraining factors, much more so than when delivering applications to a personal computer. These constraints include hardware capabilities of the devices themselves, device operating systems and application runtimes, development tools, connectivity choices, and also available services running on the Web. Some of the challenges in building experiences for devices are:
- Unlike PCs, people consider devices to be accessories: Users carry devices on their person and often view them as expressions of their lifestyle or even personal self-image. Coolness factor, great design, and user experience are critical.
Implications: Rich device design and presentation capabilities are needed—both for the device itself as well as for the applications running on the device.
- Devices have limited resources. As devices (and their batteries) become smaller and lighter, screen sizes and layouts become more restrictive, and the “power budget” to support rich applications shrinks. More complex devices tend to have shorter battery lives than simple bare-bones phones. Memory available on a device is limited as well, although this is mitigated by improvements in storage technology. Device support for Wi-Fi often comes at the price of shorter battery life as well.
Implications: Energy efficiency is critical. Device operating systems should have fine-grained control over hardware utilization. In some cases, application processing should be offloaded from the device to avoid excessive resource consumption. This leads to a trade-off with the first two items in this list.
- Devices are not standardized. Unlike PCs, the form factors and hardware/software profiles of devices are much less standardized.
Implications: A developer of applications for mobile devices needs to target the lowest common denominator for screen size, shape, and orientation in order to deploy to a wide variety of handsets. This leads to an implicit trade-off with the need for rich user experience; solution architects will have to optimize software and hardware together in order to get the best overall experience. Application developers for embedded devices face a different set of challenges in this area: lack of standardization in hardware makes it hard to presume the set of resources that will be available to the application.
- Devices need to support offline scenarios and occasionally slow connections. For many reasons, devices are not always connected (connections may not be cost-effective during international travel, for example), or access speeds are not fast enough to support decision-making at the point of need (using online maps to make routing decisions while driving, for example).
Implications: Devices need to be more than thin-client displays; they need to be application platforms in their own right. A key requirement for this capability is local processing and storage, with synchronization to PCs or Internet services.
- Connectivity is not standardized. Although several standards exist for network protocols, there are several ways for a user to access information and services on the Internet from a mobile device. Depending upon the capabilities of their device, and the service plan that they have with their network operator, a user might want to access information and services on the mobile Internet via voice (through voice recognition), messaging (SMS, email), or Internet protocols (WiFi, a tethered connection, or an appropriate data plan). In emerging markets, services will probably need to be delivered over SMS or voice, as users are more likely to have volume handsets with limited capabilities. For rich media experiences, it is likely that a fast Internet protocol connection will be required.
Implications: Tailor access to the kind of experience being delivered and the market to which it is being delivered. It is possible that services will need to be accessible to devices from a number of different end points, each supporting a different address and method of access.
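The offline requirement above amounts to a store-and-forward pattern: the application writes to local storage immediately and replays queued changes when connectivity returns. The sketch below illustrates the shape of that pattern only; the `LocalStore` class and its method names are invented for illustration, and a real solution would use a local database plus a synchronization framework rather than in-memory structures.

```python
# Minimal store-and-forward sketch for an occasionally connected device.
# All names here are illustrative, not any specific platform API.

class LocalStore:
    def __init__(self):
        self.records = {}   # local copy the application reads and writes
        self.pending = []   # changes queued while offline

    def write(self, key, value):
        # Writes always succeed locally, even with no connectivity.
        self.records[key] = value
        self.pending.append((key, value))

    def sync(self, server):
        # Replay queued changes once a connection becomes available.
        while self.pending:
            key, value = self.pending.pop(0)
            server[key] = value

# Usage: work offline, then reconcile with a "server" (a dict here).
store = LocalStore()
store.write("route", "A47 north")
store.write("eta", "18:40")

server = {}
store.sync(server)
```

The key design point is that the user-facing write path never blocks on the network; synchronization is a separate, deferrable step.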
The last issue of The Architecture Journal covered the emerging paradigm of Software + Services. In the context of a mobile application, the goal is to combine the best of the Web with the best aspects of devices—but subject to the constraints just described. As shown in Figure 4, solution architects need to design for a specific type of user experience, and pick an appropriate device based on minimum device capabilities needed (hardware and software on the device) and connectivity options and services available to users of the device.
Figure 4. A conceptual framework to apply Software + Services to a mesh of devices
Hardware Form Factors of the Device
Just as there is a wide range of mobility-based scenarios, there is a wide range of device form factors to support these scenarios as shown in Figure 1. The choice of device depends on how it is going to be used.
For example, in the consumer personal device space, the spectrum of available hardware ranges broadly from WM (Windows Mobile)-based devices on one end, laptops on the other, and UMPC devices in the middle. WM devices are usually either smartphones or pocket PCs (typically under 5-inch screen size), and UMPCs are usually portable digital companions (typically 5.6- to 7-inch screen sizes).
Sensors to Receive Inputs From the Immediate Environment
Devices can receive inputs from a variety of different sensor elements listed in this section. Some of these sensors will provide channels for users to interact with the software running on the device. Other sensors on the device will provide applications with a view of the user’s current context at any time so that software can adjust itself accordingly.
- Touch Technologies. Some devices use touch technologies to improve user experience. Newer devices use “capacitive touch,” which does not require pressure to register touch (unlike the “resistive touch” found on older devices, which often required a stylus). Devices with “capacitive touch” are easier to use, more accurate, and more responsive. The older touch screens were often not very clear in sunlight, which made it harder to see rich media, but the newer touch screens are typically brighter, as their surface isn’t covered with the thin film required for “resistive touch.” Another advance is multitouch (the ability to handle input from more than one finger at the same time), which lets users resize a window by pinching or expanding two fingers on the screen. A common user objection to touch screens is their lack of tactile response. However, some handset manufacturers are adding the tactile-feedback technology found in game controllers (for example, to give a slight vibration when a touch screen’s virtual keyboard is tapped). This will be similar to the response that users are accustomed to getting from traditional mechanical keyboards.
- GPS. Many mobility solutions depend upon knowledge of the user’s location. A common technique for a device to determine its location is GPS (the Global Positioning System), which does so based on line-of-sight signals from three or more satellites (which means that GPS cannot be used indoors). Some location-based services on the Web will automatically use GPS information on the device (when available) to provide information filtered by the user’s location (Live Search).
- Accelerometer. Some devices have accelerometers built in. A basic usage of this is to automatically detect the orientation of the device, that is, landscape or portrait mode. More advanced uses of the accelerometer include gesture recognition, media control, or game control. In the future, it is quite conceivable that accelerometers on devices could lead to sophisticated control scenarios similar to those of the remote control of the Nintendo Wii, which is built around an accelerometer.
- Health monitoring sensors. Examples are sensors for heart rate or blood sugars.
- RFID. RFID readers, scanners, and printers are a range of devices that use RFID technologies in primarily enterprise scenarios such as logistics and supply chain management. Devices with built-in RFID sensors are meant to replace an older generation of devices that use bar code scanning.
- Other Sensors. Other sensors found on devices might include ambient light sensors (to control screen brightness and preserve battery life) or proximity sensors (to turn off display when the device is being used as a phone).
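To make the accelerometer item above concrete, the sketch below infers screen orientation from raw acceleration samples. It assumes a three-axis sensor reporting gravity in g units, with x across the screen, y along it, and z out of it; the axis convention, thresholds, and function name are all illustrative rather than any real device API.

```python
# Illustrative orientation detection from a 3-axis accelerometer.
# Assumed convention: x across the screen, y along it, z out of the
# screen; values are in g (units of gravitational acceleration).

def orientation(x, y, z):
    # When the device is held roughly upright, gravity dominates one
    # of the two in-plane axes; compare magnitudes to pick which.
    if abs(y) > abs(x):
        return "portrait" if y < 0 else "portrait-upside-down"
    else:
        return "landscape-left" if x < 0 else "landscape-right"

# A device held upright sees gravity mostly along -y.
print(orientation(0.05, -0.98, 0.1))   # portrait
# Rotated 90 degrees, gravity shifts onto the x axis.
print(orientation(-0.97, 0.02, 0.1))   # landscape-left
```

Real devices would also smooth the samples and apply hysteresis so that the display does not flip while the device is being moved.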
Software Running On the Device
Software running on the device falls into the following categories:
- Device Operating System
- Application Platform—both the application runtime and design tools
- Mobile Browser—this is emerging as an application platform in its own right for consumer devices
The operating system should be chosen based on how the device is to be used, as there is an implicit trade-off between managing limited device resources and the richness of applications running on the device. Devices have different needs; let us look at the software stacks for each type of device represented in Figure 1.
Software Stack for Embedded Devices. Windows Embedded CE is a hard real-time, 32-bit, memory-protected operating system kernel that can support a wide range of processor architectures (ARM, MIPS, x86, or SH4). It comes as a set of about 700 components, from which a subset can be packaged into custom images. For example, a kernel-only image can be assembled that boots with an approximately 300-KB footprint, but it is also possible to add other technologies into the image—such as a Web server, browser, media player, networking support, or the .NET Compact Framework—all of which increase the size of the OS image. Devices built with Windows Embedded CE might be headless, or might have some form of display. Also, devices can either be open (that is, exposing application APIs) or closed (without a third-party developer story).
Windows CE is available to the general embedded system development community to build their own devices. It is also used within Microsoft to build the Windows Mobile and Microsoft Auto solutions. Windows Mobile is used to power smartphones and PDAs, while Microsoft Auto is a platform for the auto industry to build advanced in-vehicle solutions.
Software Stack for Smartphones and PDAs. Windows Mobile chooses its own set of operating system components from Windows Embedded CE, with a custom shell, device-specific technologies (connection manager), and some applications (Office Mobile). Windows Mobile OEMs often add their own specific applications and services to the image (screen plug-ins, applications like VoIP, games), but do not customize the set of components in the base WM image. The result is a consistent set of APIs that are offered across all Windows Mobile devices: In theory, applications written for one Windows Mobile device should work across all Windows Mobile devices. In reality, mobile devices vary greatly in their hardware capabilities (connectivity options, screen size, resolution, orientation), making it difficult to build an application that works well across all devices, even when the underlying APIs are the same. Figure 5 shows the range of development options for building application interfaces on a Windows Mobile smartphone.
Figure 5. Development options for building application interfaces on a Windows Mobile smartphone
Software Stack for Ultra-Mobile PCs. UMPC devices get the full-fidelity software stack—the Windows Vista operating system, the .NET Framework as the runtime for managed applications, and IE7 as the browser. Existing PC applications do not need to be rewritten to run on a UMPC—although they might be extended to support touch and ink (these capabilities are now built right into Windows Vista). However, all this comes at the cost of battery life (often just 4-6 hours), as UMPCs do not manage device resources at the granular level that Windows Mobile does.
Access Channels to Support Devices
Mobile devices such as cell phones are used primarily for communications—mostly voice calling today—but also some other forms of messaging, such as email, instant messaging (IM), and SMS. Beyond these basic communication services, it is expected that devices will connect to a much richer set of application services in the future. While there is a relatively large market for such services, it cannot be assumed that users of those services will always have advanced Internet-capable devices that either have a carrier data plan or are WiFi-enabled. This leads to service delivery to devices over other channels as well (such as SMS or voice, as in the previously discussed SMS-based services for mobile payments in Kenya).
Some examples of network channels over which applications and services may be delivered are:
- Voice Recognition: These services are accessed through a phone call, with voice recognition software running on the other end. Figure 6 shows how Microsoft Office Communications Server can be used to deliver speech-enabled applications that can either be accessed through telephony application services, or through alternate channels. As an example of such a mobile application, consider Live Search on Windows Mobile devices, which can be accessed through a voice interface in the U.S. Voice recognition software converts the user’s speech into the search query string; results are then displayed as usual.
- SMS: Some basic services are now being offered over SMS, such as stock updates, alerts, and, in some countries now, Internet search. For example, Microsoft Research did a project in India, where they built an SMS-enabled solution for a sugarcane cooperative. Farmers can use their phones to get information (such as market price information) by sending in requests as SMS messages. The responses are also sent back to them through SMS. Microsoft Research has made the toolkit used for this project available as a shared solution on CodePlex. (See Resources: SMS Server Toolkit.)
- WiFi: This works well for devices equipped for WiFi, when in the vicinity of an accessible wireless hotspot.
- Mobile data plan: This typically is a premium service offered by mobile network operators, to provide Internet access over the operator’s own cellular network.
- Connection to other devices using P2P (peer-to-peer) technologies: Some devices can exchange information with another device directly, without having that information pass through a central server. Architectural considerations, such as discovery and handshaking, can be accomplished in two ways:
i. A central index server brokers the connection: For example, consider the Xbox 360 and Xbox Live. When a user logs into his Xbox 360, he can join a group of up to 16 gamers that play within a single session. Although the Live service tracks who is online and brokers the initial connection into the group, all further messaging between Xbox 360 consoles is direct, through P2P technologies, and not through the server.
ii. Without a central index server: Discovery of nearby devices, and the handshaking between them, happens directly, without going through a central server. For example, the Zune music player can share currently playing music with up to three other Zunes in the vicinity via P2P technologies.
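The two styles above differ mainly in where discovery state lives. The sketch below simulates the second, serverless style: each device announces itself on a shared local medium and learns of nearby peers directly. The `Airwave` class stands in for a short-range radio link such as Bluetooth or ad hoc Wi-Fi; every name here is invented for illustration and is not any real P2P API.

```python
# Simulated serverless peer discovery (style ii above): devices
# announce themselves on a shared medium and learn of peers directly,
# with no index server involved. Airwave stands in for a radio link.

class Airwave:
    def __init__(self):
        self.listeners = []

    def broadcast(self, sender, message):
        # Deliver the announcement to every other device "in range."
        for device in self.listeners:
            if device is not sender:
                device.receive(message)

class Device:
    def __init__(self, name, airwave):
        self.name = name
        self.peers = set()
        self.airwave = airwave
        airwave.listeners.append(self)

    def announce(self):
        self.airwave.broadcast(self, self.name)

    def receive(self, peer_name):
        self.peers.add(peer_name)

# Three devices come into range and announce themselves.
air = Airwave()
devices = [Device(n, air) for n in ("alice", "bob", "carol")]
for d in devices:
    d.announce()

print(sorted(devices[0].peers))   # ['bob', 'carol']
```

A central index server (style i) would replace the broadcast with a registration call, with peers queried from the server before direct messaging begins.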
Figure 6. Using speech technologies in Microsoft Office Communications Server to deliver voice-enabled services over the Internet
Beyond basic communication services, it is expected that in the future, devices will connect to a rich set of services on the Internet. These services will likely be architected in three tiers:
- Application and solution services. These offer support that is specific to a set of scenarios, such as health or CRM.
- Attached or third-party services. These are services offered by other providers and attached to the application services; for example, a mobility solution for healthcare providers might use services for email, update, or collaboration provided by other parties.
- Utility/infrastructure/building block services.
Services in the second and third tiers represent common, horizontal capabilities which cut across many different application services. Some of these are described below in the context of mobility solutions. In the future, these services would typically be provided by a platform provider, such as Windows Live Platform, or even by a mobile network operator.
Device Management and Security
Attack vectors for devices are similar to those of networked personal computers, except that devices are much more likely to get lost or stolen, and it is often harder to secure them physically given their mobility (as opposed to a computer sitting on a desk or in a data center). So there are three primary areas in which to consider security issues on devices. The first is securing the device itself. The second is securing the network—that is, ensuring message confidentiality and integrity. A number of security issues here can be addressed in layers of the networking stack (for example, radio modulation techniques to provide wireless signal transmission security, IPSec, and so forth). The third area is securing the applications that run on the device—or run on the Web but are accessed through devices—and that is described in the section on identity and access management.
In some cases management of devices is like that of personal computers. For example, devices need to be upgraded with patches (firmware and software), media, or applications. Communications to devices need to be secured, and in some cases metered, and paid for (for commercial downloads). However, management of devices does differ in some crucial respects because devices are often much harder to secure: Mobile devices are easy to misplace, and in many cases, access to the devices cannot be restricted.
Management services need to be provided to devices that connect into a network, to answer the following kinds of questions.
- Network administrators: “How many devices are on my network right now? How much bandwidth? How much time? What types of devices are there?”
- Helpdesk: “What is the history for your device? Has your device been updated? What are the details of your device?”
- Security administrators: “What devices don’t have my security update? What if I enforce this policy? How many devices are in compliance?”
- User: “I just lost my device and I’d like to safeguard private information on it. Can you wipe it remotely?”
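As a minimal sketch of the kind of management service that could answer these questions, consider a toy in-memory device registry in Python (all names are hypothetical, not from any actual management product):

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    device_id: str
    device_type: str          # e.g. "smartphone", "pda"
    patch_level: int          # highest security update applied
    connected: bool = False
    history: list = field(default_factory=list)

class DeviceRegistry:
    """Toy management service answering the administrator, helpdesk,
    security, and user questions listed above."""

    def __init__(self):
        self._devices = {}

    def register(self, device):
        self._devices[device.device_id] = device

    def connected_count(self):
        # Network administrator: "How many devices are on my network right now?"
        return sum(1 for d in self._devices.values() if d.connected)

    def device_types(self):
        # Network administrator: "What types of devices are there?"
        return {d.device_type for d in self._devices.values() if d.connected}

    def out_of_compliance(self, required_patch_level):
        # Security administrator: "What devices don't have my security update?"
        return [d.device_id for d in self._devices.values()
                if d.patch_level < required_patch_level]

    def remote_wipe(self, device_id):
        # User: "Can you wipe it remotely?" A real service would queue
        # a wipe command to be executed the next time the device connects.
        self._devices[device_id].history.append("wiped")
        return True
```

A real management service would of course persist this state, authenticate callers, and push commands over the air; the sketch only shows the shape of the queries.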
Note that simply locking down a device isn't enough if it is lost with a 2 GB storage card full of sensitive information. Windows Mobile 6 addresses this problem by encrypting storage card data so that it can be read only by the device that wrote it.
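To illustrate the idea of binding encrypted data to the device that wrote it (this is not the actual Windows Mobile 6 mechanism), here is a toy Python sketch that derives a keystream from a device-unique identifier. A real implementation would use a proper cipher with hardware-protected keys, not a hash-based XOR:

```python
import hashlib

def _keystream(device_id: str, length: int) -> bytes:
    """Derive a per-device keystream (toy illustration only -- not real crypto)."""
    stream = b""
    counter = 0
    while len(stream) < length:
        stream += hashlib.sha256(f"{device_id}:{counter}".encode()).digest()
        counter += 1
    return stream[:length]

def encrypt_for_device(device_id: str, plaintext: bytes) -> bytes:
    """XOR the data with a keystream only this device can rederive."""
    ks = _keystream(device_id, len(plaintext))
    return bytes(a ^ b for a, b in zip(plaintext, ks))

# XOR is its own inverse, so decryption is the same operation.
decrypt_for_device = encrypt_for_device
```

Data written by "device-A" comes back as garbage on any other device, so a lost storage card reveals nothing on its own.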
Identity and Access Management
Applications running on different networked devices need a way to share credentials among themselves, as well as with the back-end services that they connect to. As scenarios for devices become more sophisticated, this will require universally recognized credentials (for example, the identity of the user and, in some scenarios, the identity of the originating device as well). Today a cellular phone is identified by a phone number. Although smartphones do allow the user to connect to back-end services online, those services typically require the user to authenticate in multiple additional ways that have nothing to do with the phone number, such as an email address or other Web-based credentials. As devices proliferate in the future, so will the services to support them; a single universal identifier (conceptually similar to Live ID today) might solve the authentication problem. However, other new complications will arise: How does identity get chained across devices and Web-based services? Where do boundaries of trust get established?
Although much has been done to secure networks and devices, a different set of technologies is needed to broker trust among applications running on those devices and the services that they connect to: technologies for federated identity. These technologies help the user manage multiple digital identities and control how much personal information is shared with other devices and services. Each identity is built around a set of claims—expressions of trust from a certifying party. One or more identity services in the cloud act as brokers of trust, issuing claims embedded in security tokens.
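As a rough illustration of the claims model, an identity service might sign a set of claims into a token that relying services validate before trusting. A minimal Python sketch, in which an HMAC-signed token stands in for the real federation protocols and token formats used in practice:

```python
import hashlib
import hmac
import json

# Hypothetical trust anchor shared between the identity service and
# the relying party; real federation uses certificates, not a shared secret.
ISSUER_KEY = b"shared-secret-of-identity-service"

def issue_token(claims: dict) -> str:
    """Identity service: sign a set of claims into a security token."""
    body = json.dumps(claims, sort_keys=True)
    sig = hmac.new(ISSUER_KEY, body.encode(), hashlib.sha256).hexdigest()
    return f"{body}|{sig}"

def validate_token(token: str) -> dict:
    """Relying party: accept the claims only if the issuer's signature verifies."""
    body, sig = token.rsplit("|", 1)
    expected = hmac.new(ISSUER_KEY, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("token not issued by a trusted party")
    return json.loads(body)
```

The key point is that the relying party never verifies a password itself; it trusts the claims (user identity, and potentially device identity) because a broker it trusts signed them.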
Rendezvous and Presence
Mobile presence services make users’ mobile context available to their social networks. Context data, such as location, device idle time, device profile (ring volume, vibrate), and calendar information, would be available either from other mobile devices, or through a gadget/widget/badge embedded on the user’s blog or Web page. It might be displayed as a list (by augmenting a contacts list), or might be displayed on a map. In short, mobile presence should let a user’s social networks know when they are reachable and when they are not, and what their preferred mode of communication is at that moment. This information should not be specific to a user’s mobile carrier, as it is unlikely that all members of a user’s social networks are customers of the same mobile carrier. Ideally, this presence information would connect into a unified communications backbone that combines different forms of messaging.
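For illustration, a presence service might fold these context signals into a single "preferred mode of communication" decision for the user's social network to see. A minimal Python sketch, with hypothetical signal names and an arbitrary precedence order:

```python
def preferred_channel(presence: dict) -> str:
    """Fold mobile context signals into a preferred mode of communication.

    The precedence rules here are illustrative, not prescriptive:
    calendar state wins over idle time, which wins over the ring profile.
    """
    if presence.get("in_meeting"):            # from calendar information
        return "sms"                           # silent and asynchronous
    if presence.get("idle_minutes", 0) > 30:   # device untouched for a while
        return "email"
    if presence.get("profile") == "vibrate":   # device profile
        return "im"
    return "voice"
```

A carrier-neutral presence service would publish the result of a rule like this (rather than the raw signals) to contact lists, maps, or an embedded gadget/widget/badge.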
Unlike personal computers, most consumer devices in the future will likely know their exact location, possibly through a position-determination technology such as GPS. This opens up the possibility of a wide variety of cloud services that can use that information to present content appropriate to the user's location. The growth of online Geographic Information Systems (GIS) with published interfaces for storing data and associated spatial metadata is enabling this trend. Some of these systems are also marked up with extra user-generated content tagged by location. A location-based service would use the user's geographic coordinates as an index or filter into a GIS to retrieve the right information. Such services might include local search, navigation, emergency services, tracking of children/pets/objects of value, multiplayer mobile games, locating people in your social networks, logistics/transportation management, and so on.
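As a small illustration of using coordinates as a filter into a GIS-style point set, here is a Python sketch built on the haversine great-circle distance formula (the place data and radius are made up for the example):

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points on Earth, in kilometers."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))   # 6371 km = mean Earth radius

def nearby(user_lat, user_lon, places, radius_km=5.0):
    """Use the user's coordinates as a filter into a GIS point set."""
    return [p for p in places
            if haversine_km(user_lat, user_lon, p["lat"], p["lon"]) <= radius_km]
```

A real location-based service would push this filtering into a spatially indexed store rather than scanning a list, but the interface, coordinates in, nearby content out, is the same.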
Mobile Search and Advertising Services
In some ways, Web search from a mobile device is not very different from Web search from a personal computer. An analysis of Google search logs presented in “Deciphering Trends In Mobile Search” shows that despite the limitations in input techniques, the average number of words in a search query did not change much across mobile phones, PDAs, and personal computers. (See Resources: Deciphering Trends in Mobile Search.)
However, in other ways mobile search has the potential to be more dynamic than search from a personal computer. Devices are in a position to know more about the user's current context, such as location. Search engines today process queries against an index built up by crawling the Web. The potential for mobile search is that the extra context available to the device could also be used by the search engine when formulating its response. For example, search results might be filtered by the current location, or a list of sponsored links might include a mobile coupon for a local business.
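As a toy illustration of folding device context into ranking, a mobile search service might re-rank ordinary Web results so entries matching the user's location rise to the top (the boost value and field names are arbitrary, for illustration only):

```python
def rerank_by_context(results, user_city):
    """Re-rank Web search results using the user's location as extra context.

    Each result is a dict with a base "relevance" score from the index;
    results tagged with the user's city get a hypothetical local boost.
    """
    def score(r):
        base = r["relevance"]
        if r.get("city") == user_city:
            base += 0.5   # illustrative boost for local results
        return base
    return sorted(results, key=score, reverse=True)
```

The same hook is where a sponsored-links engine could slot in a mobile coupon for a nearby business.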
As an example of this, Live Search for Windows Mobile now includes voice input (beta), gas prices, and hours of operation for businesses. The service can also use GPS data on GPS-enabled phones to provide location-aware local search.
Storage, Content Delivery and Content Management Services
As mentioned in the section on social trends, the proliferation of devices is leading to an explosion in the amount of content that is being generated and that needs to be stored off the device. This leads to the need for storage services to back up devices, for content delivery networks to move the data, and also for content management services to organize newly acquired content so that it becomes discoverable.
Organization of content implies a taxonomy—which might be explicitly defined, but is more likely to be emergent based on tagging of content with a snapshot of the context of the user at the time that the content was created. This context might be location, time, an event, and so forth—anything that the device was aware of and automatically recorded.
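As an illustration of such an emergent taxonomy, content might be stamped at creation time with whatever context the device recorded, and later discovered through any of those tags. A minimal Python sketch, with made-up field names:

```python
from datetime import datetime

def capture(content_id, context):
    """Tag newly created content with a snapshot of the device's context."""
    return {
        "id": content_id,
        "tags": {
            "time": context.get("time", datetime.now().isoformat()),
            "location": context.get("location"),
            "event": context.get("event"),
        },
    }

def find_by_tag(items, key, value):
    """Emergent taxonomy: discover content by any recorded context tag."""
    return [item for item in items if item["tags"].get(key) == value]
```

No taxonomy was defined up front; the categories simply emerge from whatever the device knew when the content was created.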
As the amount of content available online keeps increasing, there is a need for users to filter the information signals that they receive. One way that users can do this is by subscribing to a particular alerting service like Windows Live Alerts. These alerts can be received on mobile devices as SMS messages. A user can also embed a gadget for reading alerts on their own Web sites.
As users move to a world of devices, content is increasingly being left scattered across personal computers at home, at work, in online services, and now on mobile phones. An important part of end-to-end mobility solutions will include synchronization services, which solve the problem of synchronizing any content, over any protocol, and onto any device or personal computer. These synchronization services would need to be able to handle subtle issues around scenarios involving caching, offline usage, sharing, and roaming. One way to build such services is to use the Microsoft Sync Framework, which lets application developers easily add synchronization capabilities to an application or service. This is enabled through a provider model that can be extended to support common scenarios such as syncing relational databases, file systems, lists, devices, PIM, music, video, and so on.
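The Sync Framework itself is a .NET provider model; purely to illustrate the underlying idea, here is a minimal Python sketch of two-way synchronization between two replicas with last-writer-wins conflict resolution (real synchronization services track change versions and tombstones rather than raw timestamps):

```python
def sync(replica_a, replica_b):
    """Two-way sync with last-writer-wins conflict resolution.

    Each replica maps key -> (timestamp, value). After syncing, both
    replicas hold the most recently written value for every key.
    """
    for key in set(replica_a) | set(replica_b):
        a, b = replica_a.get(key), replica_b.get(key)
        # Pick whichever side has the entry, or the newer of the two.
        winner = max((x for x in (a, b) if x is not None), key=lambda x: x[0])
        replica_a[key] = winner
        replica_b[key] = winner
```

Running this between, say, a phone and a personal computer converges both devices on the newest copy of each item, which is the essence of the caching, offline, and roaming scenarios described above.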
A summary view of all the development options available to an architect of mobility solutions is shown in Figure 7, with examples of specific experiences, connection points, and access channels. There are several cross-cutting concerns for devices, services, and access technologies, such as how to manage identity and trust across all these different layers.
The approach for an architect of mobility solutions should be to balance the need to achieve broad reach and scale for his target audience against the need to deliver rich experiences that users can connect with. Ideal solutions will combine the best of the Web with the best aspects of devices.
Figure 7. A summary view of the development options for building mobility solutions
Resources
- “Deciphering Trends In Mobile Search”
- “Mobile Phones As Mass Media: Models For Content Distribution,” by Alan Moore
- “Safaricom: On a Tear in Africa,” by Jack Ewing, August 27, 2007, BusinessWeek
- SMS Server Toolkit
- “Upwardly Mobile in Africa,” by Jack Ewing, September 13,
About the Author

Atanu Banerjee is a member of the Platform Architecture Team at Microsoft, where he works on architecture for next generation solutions. He joined Microsoft from i2 Technologies, where he worked in various roles for more than seven years, including chief architect for a supply chain management product line, development manager, product architect, team lead, and software developer. During that time, he wrote a lot of code, designed new solutions, and worked with some large manufacturing customers. Prior to i2, Atanu worked in the advanced control systems group at Aspen Technologies, designing and implementing model predictive control systems for the process industry. Atanu received his Ph.D. from Georgia Tech in 1996. He resides in Redmond, Wash. with his wife and six-year-old son.
This article was published in the Architecture Journal, a print and online publication produced by Microsoft. For more articles from this publication, please visit the Architecture Journal Web site.