Fog computing is a paradigm that serves user requests at the network edge. As the definition suggests, the fog computing platform lies between the cloud servers and the users. In a fog-enabled environment, the devices at the fog layer, such as routers, gateways, bridges, and hubs, usually perform networking operations. Researchers envision these devices being capable of performing computational and networking operations simultaneously. Although these devices are resource-constrained compared to cloud servers, their geographical spread and decentralized nature help in offering reliable services over a wide coverage area. Further, with fog computing, manufacturers and service providers can offer their services at affordable rates. Another advantage of fog computing is the physical location of the devices, which is much closer to the users than that of cloud servers. Such placement reduces operational latency significantly.
We study the suitability of fog computing in IoT environments and theoretically model its parameters to support IoT applications. We then study its impact on IoT environments from the perspective of computation offloading, aiming to reduce operational latency and energy consumption. We also provide a detailed analysis of device behavior and enhance the QoS delivered to the users.
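To make the offloading trade-off concrete, the following is a minimal illustrative sketch, not the paper's actual model: a task is executed locally or offloaded to a fog node, and the decision compares end-to-end latency. All function names, parameter names, and numeric values (CPU cycles, link rate, transmit power, the CMOS energy coefficient k) are hypothetical assumptions introduced here for illustration.

```python
# Hypothetical offloading sketch: a task of `cycles` CPU cycles with an input of
# `data_bits` bits either runs on the user device or is offloaded to a fog node.

def local_cost(cycles, f_local_hz, k=1e-27):
    """Latency (s) and dynamic energy (J) for on-device execution.

    Energy uses a common CMOS approximation, k * f^2 per cycle; k is an
    assumed coefficient, not a value from the paper.
    """
    latency = cycles / f_local_hz
    energy = k * (f_local_hz ** 2) * cycles
    return latency, energy

def offload_cost(cycles, data_bits, rate_bps, f_fog_hz, p_tx_w):
    """Latency and device-side energy when offloading.

    The device transmits the input over a link of `rate_bps`, then the fog
    node computes; the device only pays transmission energy.
    """
    t_tx = data_bits / rate_bps
    t_exec = cycles / f_fog_hz
    return t_tx + t_exec, p_tx_w * t_tx

def should_offload(cycles, data_bits, f_local_hz, f_fog_hz, rate_bps, p_tx_w):
    """Offload when it strictly reduces latency (ties favor local execution)."""
    t_local, _ = local_cost(cycles, f_local_hz)
    t_offload, _ = offload_cost(cycles, data_bits, rate_bps, f_fog_hz, p_tx_w)
    return t_offload < t_local

if __name__ == "__main__":
    # A compute-heavy task with a small input favors the faster fog node:
    # local: 5e9/1e9 = 5 s; offload: 1e6/1e7 + 5e9/4e9 = 1.35 s.
    print(should_offload(cycles=5e9, data_bits=1e6, f_local_hz=1e9,
                         f_fog_hz=4e9, rate_bps=1e7, p_tx_w=0.5))  # True
```

The sketch captures the qualitative point of the section: offloading pays off when the transmission delay is small relative to the computation saved, which is exactly why placing fog devices close to users matters.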