Caching

Caching is a technically motivated cross-cutting concern that becomes interesting once applications face performance issues, such as slow external systems, expensive but cacheable calculations, or huge amounts of data. In general, caching aims to lower response times by storing data that is costly to retrieve in a potentially faster cache. A typical example is holding responses of external systems or databases in memory.

Before implementing caching, the first question to ask is whether a cache is required or even possible. Some data doesn't qualify for being cached, such as data that needs to be calculated on demand. If the data is potentially eligible for caching, the next question is whether a solution other than caching is feasible. Caching introduces duplication and the possibility of receiving outdated information and, generally speaking, should be avoided for the majority of enterprise applications. For example, if database operations are too slow, it is advisable to consider whether other means, such as indexing, can help.

Which caching solution is required depends a lot on the situation. In many scenarios, caching directly in the application's memory is already sufficient.

The most straightforward way of caching information is to keep it in a single place in the application. Singleton beans fit this scenario perfectly. A data structure that naturally serves the purpose of a cache is a Java Map type.

The CarStorage code snippet presented earlier represents a singleton EJB with bean-managed concurrency that contains a thread-safe map to store data. This storage is injected and used in other managed beans, as illustrated below the snippet:

@Singleton
@ConcurrencyManagement(ConcurrencyManagementType.BEAN)
public class CarStorage {

    private final Map<String, Car> cars = new ConcurrentHashMap<>();

    public void store(Car car) {
        cars.put(car.getId(), car);
    }

    public Car retrieve(String id) {
        return cars.get(id);
    }
}
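
To illustrate how such a storage can be injected and used, the following is a minimal sketch of another managed bean; the CarRetrieval class, its methods, and the costly retrieval step are assumptions made for illustration only:

@Stateless
public class CarRetrieval {

    @Inject
    CarStorage carStorage;

    public Car retrieveCar(String identifier) {
        // serve the car from the in-memory storage if it has been cached before
        Car cached = carStorage.retrieve(identifier);
        if (cached != null)
            return cached;

        // otherwise perform the costly retrieval and cache the result
        Car car = loadCar(identifier);
        carStorage.store(car);
        return car;
    }

    private Car loadCar(String identifier) {
        // costly retrieval, e.g. from an external system
        ...
    }
}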

If more flexibility is required, for example, pre-loading cache contents from a file, the bean can hook into its life cycle using post-construct and pre-destroy methods. To guarantee that this functionality is executed during application startup, the EJB is annotated with @Startup:

@Singleton
@Startup
@ConcurrencyManagement(ConcurrencyManagementType.BEAN)
public class CarStorage {

    ...

    @PostConstruct
    private void loadStorage() {
        // load contents from file
    }

    @PreDestroy
    private void writeStorage() {
        // write contents to file
    }
}

Interceptors can be used to add caching in a transparent way, without needing to programmatically inject and use a cache. The interceptor interrupts the execution before a business method is called and returns cached values instead. The most prominent example of this is the CacheResult functionality of the Java Temporary Caching API (JCache). JCache is a standard targeted at Java EE but, as of writing this book, not included in the umbrella specification. For applications that add the JCache functionality, eligible business methods are annotated with @CacheResult and are transparently served from a specific cache.
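
As an illustration, the following sketch mirrors the manual retrieval example from before, but leaves the caching to JCache; the CachedCarRetrieval class, its method, and the cache name are assumptions, and a JCache implementation with interceptor integration is presumed to be available:

@Stateless
public class CachedCarRetrieval {

    @CacheResult(cacheName = "cars")
    public Car retrieveCar(String identifier) {
        // costly retrieval, e.g. from an external system;
        // results are transparently cached per identifier by the interceptor
        ...
    }
}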

JCache, in general, provides sophisticated caching capabilities for scenarios where simple Java EE solutions are not sufficient. This includes distributed caching, which is provided by JCache implementations. Caching solutions typically used today are Hazelcast, Infinispan, and Ehcache. This is especially the case when several caches need to be integrated or specific concerns, such as cache eviction, come into play. JCache and its implementations provide powerful solutions here.
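
To give an impression of the programmatic side of JCache, the following sketch configures and uses a cache via the CacheManager; the cache name and the chosen expiry policy are illustrative assumptions:

CacheManager cacheManager = Caching.getCachingProvider().getCacheManager();

MutableConfiguration<String, Car> configuration = new MutableConfiguration<String, Car>()
        .setTypes(String.class, Car.class)
        // entries expire one minute after creation
        .setExpiryPolicyFactory(CreatedExpiryPolicy.factoryOf(Duration.ONE_MINUTE));

Cache<String, Car> cars = cacheManager.createCache("cars", configuration);

// the cache is used similarly to a map, assuming an existing car instance
cars.put(car.getId(), car);
Car cached = cars.get(car.getId());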
