
MongoDB fetch operation using Java API

As MongoDB has evolved, its Java API has gone through several changes as well. There are now some good, easy ways to perform different operations against the DB.

The MongoDB Java API is really simple and easy to understand. Once we understand the basics, we can build anything from simple to complex queries.

Let's take a look at an example of a MongoDB fetch operation using the new Java API, and then we'll walk through the basics.


import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.bson.Document;
import org.bson.conversions.Bson;

import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.mongodb.client.model.Filters;

public MyEntity findMyEntityById(long entityId) throws IOException {

    //Query filter: match the document whose _id equals entityId (like a WHERE clause)
    List<Bson> queryFilters = new ArrayList<>();
    queryFilters.add(Filters.eq("_id", entityId));
    Bson searchFilter = Filters.and(queryFilters);

    //Fields to return (projection).
    //_id is returned by default; a value of 0 excludes it
    List<Bson> returnFilters = new ArrayList<>();
    returnFilters.add(Filters.eq("name", 1));
    returnFilters.add(Filters.eq("_id", 0)); //Only if you don't need _id in the response
    Bson returnFilter = Filters.and(returnFilters);

    //Perform the fetch operation
    Document doc = getMongoCollection().find(searchFilter).projection(returnFilter).first();
    if (doc == null) {
        return null; //No document matched the filter
    }

    //Deserialize the document into the entity object
    JsonParser jsonParser = new JsonFactory().createParser(doc.toJson());
    ObjectMapper mapper = new ObjectMapper();

    MyEntity myEntity = mapper.readValue(jsonParser, MyEntity.class);
    return myEntity;
}
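
The getMongoCollection() helper isn't shown above. Here is a minimal sketch of how it might be wired up, assuming a 3.7+ driver (where MongoClients is available) and placeholder connection, database and collection names:

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

public class MyEntityDao {

    //Assumed connection string; adjust to your environment
    private final MongoClient mongoClient = MongoClients.create("mongodb://localhost:27017");

    //Returns the collection the fetch example reads from
    //("myDatabase" and "myEntities" are placeholder names)
    protected MongoCollection<Document> getMongoCollection() {
        return mongoClient.getDatabase("myDatabase").getCollection("myEntities");
    }
}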

All we need to do is prepare the following:

  • queryFilter: the criteria used to filter records, like a WHERE clause in SQL
  • returnFilter: known as a projection in MongoDB terms. This filter lists the fields we want back from the fetch operation. Specify the fields you need, or skip the projection entirely if you want everything, based on your requirements and performance considerations (see the sketch after this list for an alternative way to build it).
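
As a side note, the driver also ships a Projections helper class (com.mongodb.client.model.Projections) that builds the same projection document a bit more expressively. An equivalent of the returnFilter above would be:

Bson returnFilter = Projections.fields(
        Projections.include("name"),   //return only the "name" field
        Projections.excludeId()        //skip _id in the response
);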


Once you get the record back as a Document, you can use Jackson to deserialize it into the related entity object.
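
For that deserialization to work, the entity just needs to be a plain POJO whose properties match the projected fields. Here is a minimal sketch, assuming MyEntity carries only the "name" field (the class shape is an assumption for illustration) and using Jackson's databind classes (ObjectMapper, DeserializationFeature):

//Hypothetical MyEntity matching the projected "name" field
public class MyEntity {
    private String name;
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}

//Deserialization, tolerating document fields the POJO doesn't declare (e.g. _id)
ObjectMapper mapper = new ObjectMapper();
mapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false);
MyEntity myEntity = mapper.readValue(doc.toJson(), MyEntity.class);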
