Academic journal article Journal of Digital Information Management

Energy Efficient Cache Invalidation in a Mobile Environment

Abstract: Caching in a mobile computing environment has emerged as a potential technique to improve data access performance and availability by reducing the interaction between client and server. A cache invalidation strategy ensures that a cached item at a mobile client has the same value as on the origin server. To maintain cache consistency, the server periodically broadcasts an invalidation report (IR) so that each client can invalidate obsolete data items from its cache. The IR strategy suffers from long query latency, large tuning time, and poor utilization of wireless bandwidth. Using an updated invalidation report (UIR), the long query latency can be reduced. This paper presents a caching strategy that preserves the advantages of existing IR- and UIR-based strategies while improving on their disadvantages. Simulation results show that our strategy yields better performance than IR- and UIR-based strategies.

Categories and Subject Descriptors

C.2.1 [Network Architecture and Design]: Wireless Communication

General Terms

Network Architecture, Network Design, Mobile Computing

Keywords: Mobile computing, cache invalidation, wireless, data broadcast, invalidation report, disconnection, failure.

1 Introduction

The advances in portable computing devices and wireless technology have dramatically changed the way information is accessed. Wireless communication permits users to access global data from any location through mobile devices. However, the mobile computing paradigm differs sharply from the mature wired network computing model. Mobile clients in wireless environments suffer from scarce bandwidth, low-quality communication, frequent network disconnections (either voluntary or involuntary), and limited local resources (computing power, battery, storage, display, etc.). Caching frequently accessed data items on the client side is an effective technique for reducing network traffic and query latency. Bandwidth and battery power are saved, as no transmission is required for clients to access data from their caches. Furthermore, the availability of data is improved: even when a mobile client is disconnected from the network, data stored in its local cache remains accessible, making disconnected operation possible.

Data consistency must be ensured between client and server to prevent clients from answering queries with out-of-date cached data items that have since been updated at the server. To assist mobile clients in maintaining the consistency of their caches, a number of cache invalidation techniques have been proposed [1-14]. In these approaches, the server periodically broadcasts invalidation reports (IRs) to inform clients which data items have been updated during the recent past. When a mobile client receives an IR, it uses the report to identify out-of-date data items in its cache and discards them before using the cache to answer queries. However, IR-based schemes suffer from long query latency and make poor utilization of the available wireless bandwidth. Cao [1, 4, 5, 15] has proposed several updated invalidation report (UIR) based caching strategies to address this problem. Each UIR contains information about the data items most recently updated since the last IR. In the case of a cache hit, there is no need to wait for the next IR, and hence the query latency is reduced. However, on a cache miss the client still needs to wait for the data to be delivered; thus, for cache misses, the UIR strategy has the same query latency as the IR strategy.
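The client-side handling of an invalidation report described above can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the class and method names are hypothetical, and plain timestamps stand in for the broadcast intervals.

```python
# Hypothetical sketch of client-side IR processing. An invalidation report
# (IR) is modeled as a list of (item_id, update_timestamp) pairs for items
# the server changed during the most recent broadcast window.

class ClientCache:
    def __init__(self):
        # item_id -> (value, timestamp at which this copy was cached)
        self.entries = {}

    def put(self, item_id, value, now):
        self.entries[item_id] = (value, now)

    def apply_ir(self, ir):
        """Drop every cached item the IR marks as updated after we cached it."""
        for item_id, updated_at in ir:
            cached = self.entries.get(item_id)
            if cached is not None and cached[1] < updated_at:
                del self.entries[item_id]  # stale copy: invalidate

    def query(self, item_id):
        # Cache hit: answer locally, no uplink transmission needed.
        cached = self.entries.get(item_id)
        return cached[0] if cached is not None else None

cache = ClientCache()
cache.put("x", 10, now=100)
cache.put("y", 20, now=100)
# IR reports that "x" was updated at t=150 and "z" at t=150 ("z" is not cached)
cache.apply_ir([("x", 150), ("z", 150)])
print(cache.query("x"))  # None: invalidated by the IR
print(cache.query("y"))  # 20: still valid, answered from the cache
```

The sketch shows why IR-based schemes are energy efficient on the downlink side (clients only listen), but also why a cache miss still forces the client to wait for the server, which is the latency problem the UIR and the present strategy address.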

To overcome the limitations of existing cache invalidation strategies, this paper presents a synchronous stateful caching strategy in which cache consistency is maintained by periodically broadcasting update reports (URs) and request reports (RRs). The central design goals of our strategy are to reduce query latency, improve the cache hit ratio, minimize client disconnection overheads, make better use of the wireless channel, and conserve client energy. …
