Connection pooling and caching
In the previous tutorial, we added persistence to a website using the Exposed framework. In this tutorial, we'll look at how to implement database connection pooling and caching using the HikariCP and Ehcache libraries, respectively.
Add dependencies
First, you need to add dependencies for the HikariCP and Ehcache libraries. Open the gradle.properties file and specify the library versions:
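The version values below are an illustrative sketch, not taken from this page; check Maven Central for the current releases of HikariCP and Ehcache before copying them:

```
hikaricp_version = 5.0.1
ehcache_version = 3.10.8
```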
Then, open build.gradle.kts and add the following dependencies:
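The exact dependency block is not shown on this page; assuming version properties are defined in gradle.properties, the two libraries (published on Maven Central as com.zaxxer:HikariCP and org.ehcache:ehcache) would be added roughly like this:

```kotlin
val hikaricp_version: String by project
val ehcache_version: String by project

dependencies {
    // ...existing Ktor and Exposed dependencies...
    implementation("com.zaxxer:HikariCP:$hikaricp_version")
    implementation("org.ehcache:ehcache:$ehcache_version")
}
```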
Click the Load Gradle Changes icon in the top right corner of the build.gradle.kts file to install the newly added dependencies.
Connection pooling
Exposed starts a new JDBC connection inside each transaction call, when it performs the first database operation in the scope of that transaction. Establishing multiple JDBC connections is resource-expensive, and reusing existing connections improves performance. The connection pooling mechanism solves this problem.
In this section, we'll use the HikariCP framework to manage JDBC connection pooling in our application.
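Before turning to HikariCP, the core idea of pooling can be sketched on its own: a pool hands out previously created objects instead of constructing a new one for every request. The SimplePool class below is a hypothetical illustration, not part of the tutorial code:

```kotlin
// Hypothetical sketch (not part of the tutorial code): a tiny object pool
// that reuses released instances instead of creating a new one each time.
class SimplePool<T>(private val maxSize: Int, private val create: () -> T) {
    private val available = ArrayDeque<T>()
    var created = 0
        private set

    // Hand out a pooled instance if one is available, otherwise create one.
    fun borrow(): T = available.removeFirstOrNull() ?: create().also { created++ }

    // Return an instance to the pool so a later borrow() can reuse it.
    fun release(instance: T) {
        if (available.size < maxSize) available.addLast(instance)
    }
}
```

HikariCP applies the same idea to JDBC connections, which are far more expensive to create than ordinary objects.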
Extract connection settings into a configuration file
In the previous tutorial, we used a hardcoded driverClassName and jdbcURL in the com/example/dao/DatabaseSingleton.kt file to establish a database connection.
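The original snippet is missing from this page; based on the configuration values introduced below, it presumably looked roughly like this (the exact values are an assumption):

```kotlin
import org.jetbrains.exposed.sql.*

object DatabaseSingleton {
    fun init() {
        // Hardcoded connection settings (values assumed from the storage group below)
        val driverClassName = "org.h2.Driver"
        val jdbcURL = "jdbc:h2:file:./build/db"
        val database = Database.connect(jdbcURL, driverClassName)
    }
}
```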
Let's extract the database connection settings into a custom configuration group. Open the src/main/resources/application.conf file and add the storage group outside the ktor group as follows:

```
storage {
    driverClassName = "org.h2.Driver"
    jdbcURL = "jdbc:h2:file:"
    dbFilePath = build/db
}
```
Open com/example/dao/DatabaseSingleton.kt and update the init function to load the storage settings from the configuration file:

```kotlin
import io.ktor.server.config.*
import java.io.*

object DatabaseSingleton {
    fun init(config: ApplicationConfig) {
        val driverClassName = config.property("storage.driverClassName").getString()
        val jdbcURL = config.property("storage.jdbcURL").getString() +
                (config.propertyOrNull("storage.dbFilePath")?.getString()?.let {
                    File(it).canonicalFile.absolutePath
                } ?: "")
    }
}
```

The init function now accepts ApplicationConfig and uses config.property to load the custom settings.

Finally, open com/example/Application.kt and pass environment.config to DatabaseSingleton.init to load the connection settings on application startup:

```kotlin
fun Application.module() {
    DatabaseSingleton.init(environment.config)
    configureRouting()
    configureTemplating()
}
```
Enable connection pooling
To enable connection pooling in Exposed, you need to provide a DataSource as a parameter to the Database.connect function. HikariCP provides the HikariDataSource class, which implements the DataSource interface.
To create a HikariDataSource, open com/example/dao/DatabaseSingleton.kt and add the createHikariDataSource function to the DatabaseSingleton object:

```kotlin
import com.zaxxer.hikari.*

object DatabaseSingleton {
    private fun createHikariDataSource(
        url: String,
        driver: String
    ) = HikariDataSource(HikariConfig().apply {
        driverClassName = driver
        jdbcUrl = url
        maximumPoolSize = 3
        isAutoCommit = false
        transactionIsolation = "TRANSACTION_REPEATABLE_READ"
        validate()
    })
}
```

Here are some notes on the data source settings:
- The createHikariDataSource function takes the driver class name and the database URL as parameters.
- The maximumPoolSize property specifies the maximum size the connection pool can reach.
- isAutoCommit and transactionIsolation are set to sync with the default settings used by Exposed.
To use HikariDataSource, pass it to the Database.connect function:

```kotlin
object DatabaseSingleton {
    fun init(config: ApplicationConfig) {
        // driverClassName and jdbcURL are loaded from the configuration in init
        val database = Database.connect(createHikariDataSource(url = jdbcURL, driver = driverClassName))
    }
}
```

You can now run the application and make sure everything works as before.
Caching
You can supplement a database with a database cache. Caching stores frequently used data in fast temporary storage, which reduces both the workload on the database and the time needed to read frequently requested data.
In this tutorial, we'll use the Ehcache library to organize the cache in a file.
Add a cache file path to the configuration
Open the src/main/resources/application.conf
file and add the ehcacheFilePath
property to the storage
group:
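The concrete value is not shown on this page; the path below is an assumption, chosen to match the dbFilePath style used earlier, so adjust it to your project layout:

```
storage {
    driverClassName = "org.h2.Driver"
    jdbcURL = "jdbc:h2:file:"
    dbFilePath = build/db
    ehcacheFilePath = build/ehcache
}
```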
This property specifies the path to a file used to store the cache data. We'll use it later to configure a DAOFacade implementation for working with the cache.
Implement caching
To implement caching, we need to provide another DAOFacade implementation that returns a value from the cache, and falls back to the database implementation when there is no cached value.
Create a new DAOFacadeCacheImpl.kt file in the com.example.dao package and add the following implementation to it:

```kotlin
package com.example.dao

import com.example.models.*
import org.ehcache.config.builders.*
import org.ehcache.config.units.*
import org.ehcache.impl.config.persistence.*
import java.io.*

class DAOFacadeCacheImpl(
    private val delegate: DAOFacade,
    storagePath: File
) : DAOFacade {
    private val cacheManager = CacheManagerBuilder.newCacheManagerBuilder()
        .with(CacheManagerPersistenceConfiguration(storagePath))
        .withCache(
            "articlesCache",
            CacheConfigurationBuilder.newCacheConfigurationBuilder(
                Int::class.javaObjectType,
                Article::class.java,
                ResourcePoolsBuilder.newResourcePoolsBuilder()
                    .heap(1000, EntryUnit.ENTRIES)
                    .offheap(10, MemoryUnit.MB)
                    .disk(100, MemoryUnit.MB, true)
            )
        )
        .build(true)

    private val articlesCache = cacheManager.getCache("articlesCache", Int::class.javaObjectType, Article::class.java)
}
```

Here is a short overview of this code sample:
- To initialize and configure the cache, we define an Ehcache CacheManager instance. We provide storagePath as the root directory to be used for disk storage.
- We create a cache for entries that stores articles by their IDs: articlesCache maps Int keys to Article values.
- Then we provide size constraints for the local memory and disk resources. You can read more about these parameters in the Ehcache documentation.
- Finally, we obtain the created cache by calling cacheManager.getCache() with the provided name, key, and value types.
To be used in a cache, the Article class should be serializable and implement java.io.Serializable. Open com/example/models/Article.kt and update the code as follows:

```kotlin
import java.io.Serializable

data class Article(val id: Int, val title: String, val body: String) : Serializable
```

Now we're ready to implement the members of DAOFacade. Back in DAOFacadeCacheImpl.kt, add the following methods:

```kotlin
override suspend fun allArticles(): List<Article> =
    delegate.allArticles()

override suspend fun article(id: Int): Article? =
    articlesCache[id]
        ?: delegate.article(id)
            // Use ?.also so a null result from the delegate is never passed
            // to Cache.put, which rejects null values
            ?.also { article -> articlesCache.put(id, article) }

override suspend fun addNewArticle(title: String, body: String): Article? =
    delegate.addNewArticle(title, body)
        ?.also { article -> articlesCache.put(article.id, article) }

override suspend fun editArticle(id: Int, title: String, body: String): Boolean {
    articlesCache.put(id, Article(id, title, body))
    return delegate.editArticle(id, title, body)
}

override suspend fun deleteArticle(id: Int): Boolean {
    articlesCache.remove(id)
    return delegate.deleteArticle(id)
}
```

- allArticles: we don't try to cache all the articles; we delegate this to the main database.
- article: when we get an article, we first check whether it's present in the cache; only if it isn't, we delegate the call to the main DAOFacade and also add the article to the cache.
- addNewArticle: when we add a new article, we delegate the call to the main DAOFacade, but we also add the article to the cache.
- editArticle: when editing an existing article, we update both the cache and the database.
- deleteArticle: on delete, we remove the article both from the cache and from the main database.
Initialize DAOFacadeCacheImpl
Let's create an instance of DAOFacadeCacheImpl and add a sample article to be inserted into the database before the application starts:
First, open the DAOFacadeImpl.kt file and remove the dao variable initialization at the bottom of the file. Then, open com/example/plugins/Routing.kt and initialize the dao variable inside the configureRouting block:

```kotlin
import kotlinx.coroutines.*
import java.io.*

fun Application.configureRouting() {
    val dao: DAOFacade = DAOFacadeCacheImpl(
        DAOFacadeImpl(),
        File(environment.config.property("storage.ehcacheFilePath").getString())
    ).apply {
        runBlocking {
            if (allArticles().isEmpty()) {
                addNewArticle("The drive to develop!", "...it's what keeps me going.")
            }
        }
    }
}
```

And that's it. You can now run the application and make sure everything works as before.