Getting Started with Elasticsearch CRUD in Java
In this tutorial, we’ll cover creating, reading, updating, and deleting documents (CRUD operations) using Elasticsearch’s Java High-Level REST Client in a Spring Boot application.
Elasticsearch is a distributed, RESTful search and analytics engine capable of solving a growing number of use cases. It allows us to store, search, and analyze big volumes of data quickly and in near real-time. Before we dive into the code, let's review a few core concepts:
- Index: An index is a collection of documents that have somewhat similar characteristics. For example, you can have an index for customer data, another for product catalogue data, and another for order data.
- Document: A document is a basic unit of information that can be indexed. It is a JSON object stored in an index and is similar to a row in a relational database.
- Shard: An index can be divided into multiple pieces called shards. Each shard is a fully functional, independent “sub-index” that can be hosted on any node in the cluster.
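To make sharding concrete, here is a simplified sketch of how a document is routed to a shard. Real Elasticsearch hashes the routing value (the document id by default) with Murmur3 and takes it modulo the number of primary shards; the `String.hashCode()` used below is only a dependency-free stand-in for illustration.

```java
// Simplified illustration of Elasticsearch shard routing:
//   shard = hash(routing_value) % number_of_primary_shards
// Elasticsearch uses Murmur3 for the hash; String.hashCode() here is a
// stand-in so the example stays dependency-free.
public class ShardRoutingSketch {

    static int shardFor(String docId, int numberOfPrimaryShards) {
        // floorMod keeps the result non-negative even for negative hash codes
        return Math.floorMod(docId.hashCode(), numberOfPrimaryShards);
    }

    public static void main(String[] args) {
        int shards = 5;
        for (String id : new String[] {"1", "2", "42"}) {
            System.out.println("document " + id + " -> shard " + shardFor(id, shards));
        }
    }
}
```

Because the shard is derived from the document id, the same id always lands on the same shard, which is how get, update, and delete requests locate a document without searching every shard.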
Setting Up Elasticsearch Using Docker
Step 1: Install Docker
Before we begin, ensure Docker is installed on your system. You can download Docker from the official Docker website.
Step 2: Pull the Elasticsearch Docker Image
Open your terminal and run the following command to pull the official Elasticsearch Docker image:
docker pull docker.elastic.co/elasticsearch/elasticsearch:7.17.22
Step 3: Run Elasticsearch Container
Next, run the Elasticsearch container with the following command:
docker run -d --name elasticsearch -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" docker.elastic.co/elasticsearch/elasticsearch:7.17.22
We map port 9200 on the host to port 9200 in the container; this is the HTTP port used to communicate with Elasticsearch. Port 9300 is the transport port, used for node-to-node communication within a cluster.
Step 4: Verify Elasticsearch is Running
Let’s check the logs of the running container to ensure Elasticsearch started correctly:
docker logs -f elasticsearch
Alternatively, we can test the Elasticsearch setup by making an HTTP request to the Elasticsearch server:
curl -X GET "localhost:9200/"
Setting Up the Elasticsearch CRUD Java Project
Let’s go through the steps to perform CRUD operations using Elasticsearch.
Step 1: Add Dependencies
Ensure your pom.xml includes the following dependencies:
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-elasticsearch</artifactId>
    </dependency>
    <!-- Elasticsearch client -->
    <dependency>
        <groupId>org.elasticsearch.client</groupId>
        <artifactId>elasticsearch-rest-high-level-client</artifactId>
        <version>7.17.22</version>
    </dependency>
    <!-- JSON handling -->
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.17.1</version>
    </dependency>
    <!-- SLF4J for logging -->
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-simple</artifactId>
        <version>1.7.32</version>
        <scope>provided</scope>
    </dependency>
</dependencies>
Next, define the Elasticsearch connection properties in src/main/resources/application.properties:
elasticsearch.host=localhost
elasticsearch.port=9200
elasticsearch.username=your_username
elasticsearch.password=your_password
If you haven’t explicitly set up authentication and security in Elasticsearch, it might be running with default settings where authentication is not enabled. In such cases, Elasticsearch doesn’t require a username and password for REST API access.
We can query the cluster settings using the Elasticsearch API, with curl or a browser-based REST client:
curl -XGET 'http://localhost:9200/_cluster/settings'
Step 2: Create the Product Model
We need to create a Product class to represent the product data:
public class Product {

    private String id;
    private String category;
    private String name;
    private String description;
    private double price;

    // constructor, getters and setters
}
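When a Product is indexed, it is stored as a JSON document. The sketch below builds that document shape by hand using only the JDK, purely to show what ends up in the products index; the service in the next step delegates this conversion to Jackson's ObjectMapper instead.

```java
import java.util.Locale;

// Hand-rolled sketch of the JSON document a Product maps to when indexed.
// For illustration only - the ProductService uses Jackson's ObjectMapper.
public class ProductJsonSketch {

    static String toJson(String id, String category, String name,
                         String description, double price) {
        // Locale.ROOT guarantees '.' as the decimal separator
        return String.format(Locale.ROOT,
                "{\"id\":\"%s\",\"category\":\"%s\",\"name\":\"%s\"," +
                "\"description\":\"%s\",\"price\":%.2f}",
                id, category, name, description, price);
    }

    public static void main(String[] args) {
        System.out.println(toJson("1", "CategoryA", "Product A",
                "This is a product A", 99.99));
    }
}
```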
Step 3: Create the Elasticsearch Client Service
Then, we create a service to manage the Elasticsearch client and perform CRUD operations.
Note that type-based request constructors are deprecated in Elasticsearch 7.x, so we address documents by index and id only:
@Service
public class ProductService {

    private static final String INDEX = "products";

    private final RestHighLevelClient client;
    private final ObjectMapper objectMapper;

    @Autowired
    public ProductService(RestHighLevelClient client, ObjectMapper objectMapper) {
        this.client = client;
        this.objectMapper = objectMapper;
    }

    // Create Product
    public String createProduct(Product product) throws IOException {
        Map<String, Object> productMap = objectMapper.convertValue(product, Map.class);
        IndexRequest indexRequest = new IndexRequest(INDEX)
                .id(product.getId())
                .source(productMap, XContentType.JSON);
        IndexResponse indexResponse = client.index(indexRequest, RequestOptions.DEFAULT);
        return indexResponse.getId();
    }

    // Get Product
    public Product getProduct(String id) throws IOException {
        GetRequest getRequest = new GetRequest(INDEX, id);
        GetResponse getResponse = client.get(getRequest, RequestOptions.DEFAULT);
        if (getResponse.isExists()) {
            Map<String, Object> sourceAsMap = getResponse.getSourceAsMap();
            return objectMapper.convertValue(sourceAsMap, Product.class);
        }
        return null;
    }

    // Update Product
    public String updateProduct(Product product) throws IOException {
        Map<String, Object> productMap = objectMapper.convertValue(product, Map.class);
        UpdateRequest updateRequest = new UpdateRequest(INDEX, product.getId())
                .doc(productMap);
        UpdateResponse updateResponse = client.update(updateRequest, RequestOptions.DEFAULT);
        return updateResponse.getId();
    }

    // Delete Product
    public String deleteProduct(String id) throws IOException {
        DeleteRequest deleteRequest = new DeleteRequest(INDEX, id);
        DeleteResponse deleteResponse = client.delete(deleteRequest, RequestOptions.DEFAULT);
        return deleteResponse.getId();
    }
}
Step 4: Create the Product Controller
Subsequently, we create a REST controller to handle HTTP requests and map them to ProductService methods:
@RestController
@RequestMapping("/products")
public class ProductController {

    @Autowired
    private ProductService productService;

    // Create Product
    @PostMapping
    public ResponseEntity<String> createProduct(@RequestBody Product product) {
        try {
            String productId = productService.createProduct(product);
            return ResponseEntity.ok(productId);
        } catch (IOException e) {
            return ResponseEntity.status(500).body(e.getMessage());
        }
    }

    // Get Product by ID
    @GetMapping("/{id}")
    public ResponseEntity<Product> getProduct(@PathVariable String id) {
        try {
            Product product = productService.getProduct(id);
            if (product != null) {
                return ResponseEntity.ok(product);
            } else {
                return ResponseEntity.notFound().build();
            }
        } catch (IOException e) {
            return ResponseEntity.status(500).body(null);
        }
    }

    // Update Product
    @PutMapping("/{id}")
    public ResponseEntity<String> updateProduct(@PathVariable String id, @RequestBody Product product) {
        try {
            product.setId(id);
            String updatedProductId = productService.updateProduct(product);
            return ResponseEntity.ok(updatedProductId);
        } catch (IOException e) {
            return ResponseEntity.status(500).body(e.getMessage());
        }
    }

    // Delete Product
    @DeleteMapping("/{id}")
    public ResponseEntity<String> deleteProduct(@PathVariable String id) {
        try {
            String deletedProductId = productService.deleteProduct(id);
            return ResponseEntity.ok(deletedProductId);
        } catch (IOException e) {
            return ResponseEntity.status(500).body(e.getMessage());
        }
    }
}
Step 5: Create Elasticsearch Configuration Class
We need to create a class named ElasticsearchConfig in our Spring Boot project:
@Configuration
public class ElasticsearchConfig {

    @Value("${elasticsearch.host}")
    private String host;

    @Value("${elasticsearch.port}")
    private int port;

    @Value("${elasticsearch.username}")
    private String userName;

    @Value("${elasticsearch.password}")
    private String password;

    @Bean(destroyMethod = "close")
    public RestHighLevelClient restClient() {
        final CredentialsProvider credentialsProvider = new BasicCredentialsProvider();
        credentialsProvider.setCredentials(AuthScope.ANY,
                new UsernamePasswordCredentials(userName, password));
        RestClientBuilder builder = RestClient.builder(new HttpHost(host, port))
                .setHttpClientConfigCallback(httpClientBuilder ->
                        httpClientBuilder.setDefaultCredentialsProvider(credentialsProvider));
        return new RestHighLevelClient(builder);
    }
}
If you’re configuring Elasticsearch without using a username and password for authentication:
@Bean(destroyMethod = "close")
public RestHighLevelClient restClient() {
    return new RestHighLevelClient(
            RestClient.builder(new HttpHost(host, port)));
}
Step 6: Verify the Elasticsearch CRUD Operations
Run the Spring Boot application: navigate to the project directory in a terminal. If the project does not already contain the Maven wrapper, generate it first, then build and run the application:
mvn -N io.takari:maven:wrapper
./mvnw spring-boot:run
Now that both Elasticsearch and the Spring Boot application are running, we can test the CRUD operations using the curl commands below.
Create a Product:
curl -X POST -H "Content-Type: application/json" -d '{
"id": "1",
"name": "Product A",
"description": "This is a product A",
"category": "CategoryA",
"price": 99.99
}' http://localhost:8080/products
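If you prefer to test from Java instead of curl, the JDK 11+ java.net.http.HttpClient can issue the same request. The sketch below only builds the request (the endpoint and payload assume the controller above running on localhost:8080); pass it to HttpClient.newHttpClient().send(...) to actually send it.

```java
import java.net.URI;
import java.net.http.HttpRequest;

// Builds the same POST request as the curl example, using the JDK HttpClient API.
public class CreateProductRequestSketch {

    static HttpRequest build() {
        String json = "{\"id\":\"1\",\"name\":\"Product A\"," +
                "\"description\":\"This is a product A\"," +
                "\"category\":\"CategoryA\",\"price\":99.99}";
        return HttpRequest.newBuilder(URI.create("http://localhost:8080/products"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(json))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest request = build();
        System.out.println(request.method() + " " + request.uri());
    }
}
```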
Get a Product:
curl http://localhost:8080/products/1
Response:
{"id":"1","category":"CategoryA","name":"Product A","description":"This is a product A","price":99.99}
Update a Product:
curl -X PUT -H "Content-Type: application/json" -d '{
"id": "1",
"name": "Updated Product A",
"description": "This is an updated product A",
"price": 109.99
}' http://localhost:8080/products/1
Response when retrieved again:
{"id":"1","category":null,"name":"Updated Product A","description":"This is an updated product A","price":109.99}
Note that category is now null: the service serializes the whole Product object into the update document, so the unset category field overwrites the previously stored value.
Delete a Product:
curl -X DELETE http://localhost:8080/products/1
Conclusion
In this tutorial, we’ve successfully set up Elasticsearch using Docker and created a Java project with Spring Boot to perform CRUD operations on Elasticsearch.
In our next post, we will explore how to synchronize MySQL with Elasticsearch using Logstash. This will allow us to keep our search index updated in real-time, ensuring that our application provides the most accurate and up-to-date search results, especially when dealing with large data sets.