
When we work with containers, we often need a development or testing environment that reflects production. In my case, I needed to create a customized Docker image containing databases and collections from a MongoDB instance in production.
To do this, I used a shell script to export and import the data. First, I present the script and then I explain how to incorporate it into a Docker image.
MongoDB Data Export and Import with Shell Script
As the import takes place inside a Docker container, no authentication (user/password) was required in the import process. In addition, to avoid an excessive volume of data in the test environment, I limited the export to 100 documents per collection – this value can be adjusted as required.
Below is the script used to export the data from a MongoDB instance and then import it into a Docker container:
### MONGOEXPORT
# List all databases, filtering out the mongo shell banner and the system databases
candidates=$(echo "show databases" | mongo -u <user> --host <hostname>:27017 --authenticationDatabase admin -p <password> | grep -Ev "^(MongoDB|connecting|admin|local|test|bye)" | awk '{print $1}')
# Make sure the output directory exists
mkdir -p json
for candidate in $candidates; do
  # List the collections of the current database
  collections=$(echo "show collections" | mongo -u <user> --host <hostname>:27017 --authenticationDatabase admin -p <password> $candidate --quiet)
  var=0
  for collection in $collections; do
    var=$((var + 1))
    # Export up to 100 documents per collection to json/<n>__<database>__<collection>__export.json
    mongoexport -u <user> --host <hostname>:27017 --authenticationDatabase admin -p <password> --db $candidate -c $collection --limit=100 --out json/${var}__${candidate}__${collection}__export.json
  done
done
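Each run leaves one file per collection under the json/ directory, with the database and collection names embedded in the file name, separated by double underscores. With a hypothetical database named store containing orders and customers collections, the output would look something like this (names here are only illustrative):

json/1__store__orders__export.json
json/2__store__customers__export.json

The import script below relies on this naming convention: it splits each file name on the double underscore to recover the database and collection into which the documents should be loaded.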
### MONGOIMPORT
# Recover the database and collection names from each file name (<n>__<database>__<collection>__export.json)
for i in $(ls json); do
  database=$(echo $i | awk -F "__" '{print $2}')
  collection=$(echo $i | awk -F "__" '{print $3}')
  # Import into the local mongod; no authentication is needed inside the container
  mongoimport -d $database -c $collection --file json/$i
done
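If you want to test the import against a running MongoDB container before baking everything into an image, a minimal sketch would be the following, assuming the import loop above was saved as import.sh and the container is named mongo-test (both names are only illustrative):

# Copy the exported files and the import script into the running container
docker cp json mongo-test:/tmp/
docker cp import.sh mongo-test:/tmp/
# Run the import inside the container, where mongod listens locally without authentication
docker exec mongo-test bash -c "cd /tmp && bash import.sh"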