A year ago I worked on a web service that had Postgres and Elasticsearch as backends. Postgres was doing most of the
work and was the primary source of truth about all data, but some documents were replicated in Elasticsearch for
querying. Elasticsearch was easy to get started with, but had an ongoing maintenance cost: it was one more moving part
to break down, it occasionally went out of sync with the main database, it was another thing for new developers to
install, and it added complexity to the deployment, as well as the integration tests. But most of the features of
Elasticsearch weren’t needed because the documents were semi-structured, and the search queries were heavily
keyword-based. Dropping Elasticsearch and just using Postgres turned out to work okay. No, I’m not talking about
brute-force string matching using LIKE
expressions (as
implemented in certain popular CMSs); I’m talking about using the featureful text search indexes in good modern
databases. Text search with Postgres took more work to implement, and couldn’t do all the things Elasticsearch
could, but it was easier to deploy, and since then it's been zero maintenance. Overall, the team still considers it a net
win (I talked to some of the developers again just recently).
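
To give a flavour of what "featureful text search" means here, below is a minimal sketch of the Postgres approach. The `docs` table, its columns, and the query string are invented purely for illustration; the real schema was different.

```sql
-- Hypothetical table of documents to search.
CREATE TABLE docs (
    id    serial PRIMARY KEY,
    title text NOT NULL,
    body  text NOT NULL
);

-- An expression GIN index over the parsed text, so keyword queries
-- don't have to scan and re-parse every row.
CREATE INDEX docs_fts_idx ON docs
    USING GIN (to_tsvector('english', title || ' ' || body));

-- A keyword search with stemming and ranking. The WHERE clause uses the
-- same expression as the index so the planner can use it.
SELECT id, title
FROM docs
WHERE to_tsvector('english', title || ' ' || body)
      @@ websearch_to_tsquery('english', 'replicated documents')
ORDER BY ts_rank(
             to_tsvector('english', title || ' ' || body),
             websearch_to_tsquery('english', 'replicated documents')
         ) DESC
LIMIT 10;
```

That's roughly the shape of it: a tsvector expression, a GIN index over it, and `@@` plus `ts_rank` in queries. On newer Postgres versions you can also keep the tsvector in a stored generated column instead of repeating the expression.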