by Liang Zheng (Postdoctoral Research Associate, Princeton University) and Carlee Joe-Wong (Assistant Professor, Carnegie Mellon University)
What are the economics behind the emerging market for fog computing? Fog computing enables advanced digital deployments in AI, IoT, and 5G, and it creates new business models built on new services. Yet there are additional economic factors at stake. In this article and in the accompanying short video, we take a look at the network economics of fog computing: the pricing policies and incentive mechanisms for networked and computing systems.
451 Research recently forecast that the worldwide market for fog computing will reach $18 billion by 2022. Yet realizing this opportunity requires careful design and analysis of the economic interactions created by fog computing. We call this fogonomics.
Fogonomics, the economics of fog computing, deals with the economic factors that affect the design of fog architectures, the economic consequences of those architectures for end users and operators, and the economic interactions between end users and their service providers.
Internet and cloud service providers have successfully used pricing as a tool for both congestion/demand control and revenue generation. Carefully designed incentive mechanisms can even enable new Internet-based services: a judicious incentive mechanism, for instance, could enable the pooling of distributed, dispersed resources in fog computing and the Internet of Things (IoT). Though fog computing is expected to lead to lucrative new business models such as fog-as-a-service, the research value and social impact of fogonomics remain largely unstudied. Indeed, the main research problems in this emerging area are just beginning to be formalized.
In this blog post, we illustrate some representative research problems in fogonomics by considering the challenges faced by user devices trying to extract economic benefits from fog computing or to use fog computing to define new business models.
The Economic Opportunity for Fog
Fog computing itself radically extends the current definition of “cloud computing” to include the growing number of powerful edge devices and smart clients among end users. For example, wearable devices, such as activity trackers, might utilize connected smartphones as a “cloud server.” These devices could then use the smartphone’s computational capabilities to analyze the data that they collect, instead of sending this data over the backbone network to reach a remote cloud.
Fog computing thus creates an economic opportunity for users to save on both data transport and computation costs. Devices such as drones, cars, or phones can become mobile, cost-effective “clouds” that charge users for their computational services. That’s an example of one business model enabled by fog computing. Creating these new business models, however, is non-trivial.
Challenge #1 – Quantifying Distribution Benefits
The first challenge faced by user devices is that they may not know how to quantify the economic benefits of distributing their functionality among fog devices. For example, a device may not be able to easily estimate the cost of sending its data to the cloud or of renting computational resources on cloud servers.
Migrating cloud capabilities to the edge has performance implications. A single drone, for example, may have too many images or videos to process on its own, but it may need instantaneous feedback from this image processing – say, if other drones are waiting for the processed images to visualize a scene.
Purchasing more bandwidth to rapidly send all of the drone’s images to the cloud for processing may meet this latency requirement, but could also be too expensive. To meet the stringent latency requirement at a reasonable cost, we must instead rely on “crowdsourcing” the data processing and storage to nearby fog devices, while strategically collaborating with the more capable remote cloud. Users may abandon cloud or fog services altogether if their performance requirements cannot be met.
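To make this trade-off concrete, the back-of-the-envelope sketch below compares the cost and latency of shipping a drone’s images to a remote cloud versus splitting the work across nearby fog devices. All of the numbers (data volume, bandwidth prices, processing times, micropayments) are hypothetical, meant only to illustrate the kind of calculation a device would need to perform.

```python
# Back-of-the-envelope comparison of cloud offload vs. fog "crowdsourcing".
# Every number below is hypothetical, chosen only to illustrate the trade-off.

DATA_MB = 500                # images collected by the drone
CLOUD_BW_MBPS = 50           # purchased uplink bandwidth to the remote cloud
CLOUD_PRICE_PER_MB = 0.01    # $ per MB of backbone transport
CLOUD_COMPUTE_SEC = 20       # processing time on a powerful cloud server
CLOUD_COMPUTE_PRICE = 0.05   # $ per second of rented cloud compute

FOG_DEVICES = 5              # nearby devices willing to help
FOG_BW_MBPS = 100            # short-range link to each fog neighbor
FOG_COMPUTE_SEC_EACH = 60    # each fog device is slower than the cloud
FOG_PRICE_PER_DEVICE = 0.10  # flat micropayment per helping device

def cloud_offload():
    """Send all data to the cloud: pay for transport and rented compute."""
    latency = DATA_MB * 8 / CLOUD_BW_MBPS + CLOUD_COMPUTE_SEC
    cost = DATA_MB * CLOUD_PRICE_PER_MB + CLOUD_COMPUTE_SEC * CLOUD_COMPUTE_PRICE
    return latency, cost

def fog_crowdsource():
    """Split the work evenly; shares are sent and processed in parallel."""
    share = DATA_MB / FOG_DEVICES
    latency = share * 8 / FOG_BW_MBPS + FOG_COMPUTE_SEC_EACH
    cost = FOG_DEVICES * FOG_PRICE_PER_DEVICE
    return latency, cost

if __name__ == "__main__":
    for name, strategy in [("cloud", cloud_offload), ("fog", fog_crowdsource)]:
        latency, cost = strategy()
        print(f"{name}: {latency:.1f} s, ${cost:.2f}")
```

Under these particular assumptions the fog option is both faster and cheaper, but changing a single parameter, say the per-device processing time, can reverse the conclusion, which is exactly why quantifying these benefits is hard.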
Challenge #2 – Designing Pricing Schemes
The second challenge to realizing new business models is the fact that service providers can offer a large variety of pricing schemes to attract a higher market share and to shape user demand. Even today, cloud providers offer various types of pricing that supplement the conventional pay-as-you-go model. These may include auction-based pricing, where users can bid for spare cloud resources at their desired prices, and volume-discount pricing, where users are charged a lower unit price if their usage exceeds a given threshold.
In the context of IoT, Amazon Web Services has introduced serverless computing for real-time processing and rapid control loops in IoT applications. Its pricing depends on the number of computation requests and on the compute time consumed at the chosen memory size.
Pricing models like these, which are well suited to fog applications, can significantly contribute to user satisfaction and can even shape user demand so as to reduce congestion and provide a better quality of experience. When designed properly, they can also generate higher revenue and profit margins, less churn, and other benefits for fog providers. Designing such pricing schemes is the main challenge faced by providers in a fogonomics context.
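To give a feel for how such usage-based pricing looks from the user’s side, here is a minimal sketch of two of the schemes mentioned above: a serverless-style bill (requests plus compute time at a chosen memory size) and a volume discount. The rates and thresholds are illustrative assumptions, not any provider’s actual price list.

```python
# Minimal sketch of two usage-based pricing schemes described above.
# All rates and thresholds are illustrative, not any provider's real prices.

def serverless_bill(requests, avg_seconds, memory_gb,
                    price_per_million_requests=0.20,
                    price_per_gb_second=0.0000167):
    """Charge per request plus per GB-second of compute at the chosen memory size."""
    request_charge = requests / 1_000_000 * price_per_million_requests
    compute_charge = requests * avg_seconds * memory_gb * price_per_gb_second
    return request_charge + compute_charge

def volume_discount_bill(usage_gb, threshold_gb=100,
                         base_price=0.10, discounted_price=0.06):
    """Charge a lower unit price for usage beyond a given threshold."""
    if usage_gb <= threshold_gb:
        return usage_gb * base_price
    return threshold_gb * base_price + (usage_gb - threshold_gb) * discounted_price

print(f"serverless: ${serverless_bill(2_000_000, 0.2, 0.5):.2f}")
print(f"volume discount: ${volume_discount_bill(250):.2f}")
```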
Challenge #3 – Incentivizing Resource Sharing
A fogonomics business model in which devices sell resources to users must take into account the need to occasionally share resources with other devices. The concept of a sharing economy is not new in the mobile data market: Google has introduced a cross-carrier data plan, Project Fi, that pools and resells network capacity from T-Mobile, Sprint, and US Cellular.
Similar ideas can be applied to sharing computation, communication, and storage capabilities among fog devices. Yet the economics of such sharing environments is currently under-studied. Devices’ willingness to share resources with each other or offer pooled resources to users can be controlled by incentive mechanisms, such as micropayments in fog networks.
Such an “uberization” of network resources may require surge pricing to ensure the trustworthiness of the devices offering resources, yet it is unclear how such surge pricing, or indeed devices’ incentive mechanisms in general, should be designed.
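As a rough sketch of what such an incentive mechanism could look like, a fog broker might scale micropayments with the ratio of requested to available resources, as in the toy surge-pricing rule below. The functional form, base rate, and price cap are assumptions for illustration only, not a design proposal.

```python
# Toy surge-pricing rule for micropayments in a fog resource pool.
# The functional form, base rate, and cap are illustrative assumptions.

def surge_multiplier(requested_units, available_units, cap=3.0):
    """Scale payments with scarcity: more demand per unit of supply, higher price."""
    if available_units <= 0:
        return cap
    return min(cap, max(1.0, requested_units / available_units))

def micropayment(base_rate_per_unit, units, requested_units, available_units):
    """Payment offered to a device contributing `units` of compute or storage."""
    return base_rate_per_unit * units * surge_multiplier(requested_units, available_units)

# Quiet period: supply exceeds demand, so devices earn the base rate.
print(micropayment(0.01, 10, requested_units=40, available_units=100))   # 0.10
# Busy period: demand is 2.5x supply, so payments surge accordingly.
print(micropayment(0.01, 10, requested_units=250, available_units=100))  # 0.25
```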
The era of fog computing has begun, and it will reach beyond the IoT to encompass AI, 5G mobile networks, and more. Fogonomics, and the research problems identified here, will play a significant role in the future of fog computing by enabling it to become a multi-billion-dollar marketplace.
About the author:
Liang Zheng received her bachelor’s degree in software engineering from Sichuan University, Chengdu, China, in 2011, and her Ph.D. degree in computer science from the City University of Hong Kong, Hong Kong, in 2015. Upon completing her Ph.D., she joined the Princeton University EDGE Lab as a postdoctoral research associate, working with Professor Mung Chiang, founder of the Lab and co-founder of the OpenFog Consortium. Her research interests are primarily in understanding user behavior in computing systems, particularly from an economic perspective.