Running Dynomite on AWS with Docker in a multi-host overlay network

Dynomite is a kick-ass project. Basically, it allows you to have strong consistency on top of NoSQL databases. I've been using Dynomite in production (AWS) for a while and I can say the core is rock solid and it just works.

Lots of developers use Windows or Mac, but Dynomite is built in C and is really meant for Linux (like all good things). So some time ago I made 2 simple projects to get started quickly with Dynomite. Basically, each project creates a simple 3-node Dynomite cluster that you can run on your local machine with Docker.

There are 2 projects: one creates a Dynomite cluster with Redis, the other with Facebook's RocksDB (experimental). You can use them on your local machine to debug and they work just fine. So why not go one step further and run Dynomite on AWS using Docker? There are cool benefits to this approach.

Running Dynomite on AWS with Docker

Dynomite works very well on AWS, but also on any other cloud vendor or bare-metal DC. Dynomite runs on Docker just fine too. On AWS you can choose to run it on plain EC2, ECS, Kubernetes on EC2, or EC2 with Docker.

The Benefits

There are many advantages to running Dynomite with Docker on AWS.

Here are some of the benefits -- the good things:
 - Cost savings: you can benefit from your reservations and get better resource utilization.
 - Lower latency: running on Docker lets you easily deploy on the same box as the application and reduce network round trips.
 - Portability: the same Docker image can run anywhere, including the developer machine.
 
The Cons

Like everything in life, there are pros and cons. Here are some I found:

- Networking: Docker networking can get very tricky and hard to maintain.
- Size limitations: the default network is a /24, so it's limited to 256 IPs. Of course, you can create more networks.
- More complexity: you will have Docker, a Docker cluster (Swarm) and a Docker network (overlay) to manage, so there are more moving parts and points of failure compared with just running Dynomite directly on EC2.
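On the size limitation: the subnet is fixed when the network is created, so if 256 IPs are not enough you can pick a bigger range up front. A minimal sketch -- the network name and subnet below are just examples:

```shell
# Create an overlay network with a /16 instead of the default /24.
# The subnet cannot be changed after creation, so size it up front.
docker network create \
  --driver overlay \
  --attachable \
  --subnet 10.10.0.0/16 \
  dynomite-net-big
```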

Getting Started 

Now we will install Docker and Docker Swarm, configure a Docker cluster, create an overlay network, and run a Dynomite cluster in Docker on EC2. Phew! Long list. :-)
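To keep things self-contained, here is one way to install Docker on the EC2 boxes using Docker's convenience script. The exact user account (ec2-user) and init system depend on your AMI, so treat this as a sketch:

```shell
# Install Docker via Docker's convenience script (works on most distros).
curl -fsSL https://get.docker.com -o get-docker.sh
sudo sh get-docker.sh

# Start the daemon now and on boot.
sudo systemctl enable --now docker

# Optional: let the default user run docker without sudo
# (log out and back in for the group change to apply).
sudo usermod -aG docker ec2-user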



We will do something very silly and simple: we will deploy a 3-node cluster. This cluster won't have sharding (you can have sharding on Dynomite -- it just depends on the seeds config -- but for the sake of simplicity we will not do it), cold bootstrapping or S3 backups. If you are interested in those features, take a look at Dynomite-Manager.

Basically, we need to do the following steps to get this working:
 1. Create EC2 instances (let's say 2) -- later you can automate this (Ansible, Boto3, Terraform, whatever).
 2. Create a Security Group (use the same SG for all EC2 instances), e.g. sg_dynomite_docker.
 3. Open the ports in the SG: 8101, 8102, 6379, 2377 (plus 7946 tcp/udp and 4789 udp for the overlay traffic) and any others your app might need.
 4. SSH to the first box and install Docker.
 5. Initialize Docker Swarm -- this box becomes the manager; Docker will give you the join command.
 6. SSH to the other box and join it to the swarm.
 7. Create a Docker network with the overlay driver -- make sure it's attachable.
 8. Configure the Dynomite YAML files to use fixed IPs on the overlay network.
 9. docker run the Dynomite container 2x on one host.
10. docker run Dynomite on the other host. That's it.
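The Docker side of the steps above boils down to a handful of commands. The host IP, subnet, container names and the dynomite-docker image name are all assumptions -- adjust them to your environment:

```shell
# --- host A (manager): initialize the swarm (step 5) ---
docker swarm init --advertise-addr 172.31.0.10
# this prints a "docker swarm join --token <token> 172.31.0.10:2377" command

# --- host B: join the swarm (step 6) -- paste the command printed above ---
# docker swarm join --token <token> 172.31.0.10:2377

# --- host A: create an attachable overlay network (step 7) ---
docker network create --driver overlay --attachable \
  --subnet 10.0.1.0/24 dynomite-net

# --- host A: run two Dynomite containers with fixed IPs (step 9) ---
docker run -d --name dyn-a1 --network dynomite-net --ip 10.0.1.11 dynomite-docker
docker run -d --name dyn-a2 --network dynomite-net --ip 10.0.1.12 dynomite-docker

# --- host B: run the third node (step 10) ---
docker run -d --name dyn-b1 --network dynomite-net --ip 10.0.1.13 dynomite-docker
```

Fixing the subnet with --subnet is what makes the static --ip assignments predictable across hosts.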

You can use my dynomite-docker project as a starting point and make the changes there: the configs and Dockerfile are already done, so you just need to change the IPs, remove the volume mapping and make sure you create the Docker network with the overlay driver. That's it. Here is a sample snippet that might help you: https://gist.github.com/diegopacheco/6c75a445337e1ac29fd9ae07a16e2500
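For step 8, a dynomite.yml for one of the nodes could look roughly like this. The IPs, rack/DC names and token values are illustrative -- the seed entry format is ip:port:rack:dc:token:

```yaml
dyn_o_mite:
  datacenter: dc1
  rack: rack1
  dyn_listen: 0.0.0.0:8101        # peer (gossip) port
  listen: 0.0.0.0:8102            # client port
  servers:
    - 127.0.0.1:6379:1            # local Redis backend
  dyn_seed_provider: simple_provider
  dyn_seeds:                      # the other two nodes, on the overlay IPs
    - 10.0.1.12:8101:rack2:dc1:4294967294
    - 10.0.1.13:8101:rack3:dc1:4294967294
  tokens: '4294967294'
  data_store: 0                   # 0 = Redis
```

Each node gets its own copy with its own rack name and the other two nodes as seeds, which is why fixed IPs on the overlay network matter.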

Cheers,
Diego Pacheco
