In this tutorial, we will build an application with a client, a server, and a database in the traditional way, then containerize them with Docker.
This tutorial assumes that you have the following installed locally:

· Node.js and npm
· PostgreSQL
· Docker
· Docker Compose
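You can quickly confirm the prerequisites are available by printing their versions:

```shell
node -v
npm -v
psql --version
docker -v
docker-compose -v
```

If any of these commands fails, install the missing tool before continuing.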
1. Building the Application without Containerization
1.1 Creating the Server
Make a project directory in your workspace, with a server directory inside it:

```shell
# Make a separate directory in your workspace
mkdir tutorial
# Change into the newly created directory
cd tutorial
# Make a server directory
mkdir server
# Change into the server directory
cd server
```
Setting up a Node server with Babel
Initialize package.json:

```shell
npm init -y
```
First we’ll install @babel/runtime, plus @babel/cli, @babel/core, @babel/preset-env, and @babel/plugin-transform-runtime as dev dependencies.

```shell
npm install --save @babel/runtime
npm install --save-dev @babel/cli @babel/core @babel/preset-env @babel/plugin-transform-runtime
```
Then we’ll create a .babelrc file for configuring Babel:

```shell
touch .babelrc
```

This will host any options we might want to configure Babel with:

```json
{
  "presets": ["@babel/preset-env"],
  "plugins": [
    ["@babel/transform-runtime"]
  ]
}
```
Then create our server in index.js:

```shell
touch index.js
```

```js
import http from 'http';

const server = http
  .createServer((req, res) => {
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end('Hello World\n');
  })
  .listen(5000, '127.0.0.1');

console.log('Server running at http://127.0.0.1:5000/');

export default server;
```
With recent changes to Babel, you will need to transpile your ES6 before Node can run it. So, we’ll add our first script, build, in package.json:

```json
...
"scripts": {
  "build": "babel index.js -d dist",
  "start": "npm run build && node dist/index.js",
  "test": "echo \"Error: no test specified\" && exit 1"
},
...
```
Now let’s start our server:

```shell
npm start
```
You should now be able to visit http://127.0.0.1:5000 and see Hello World.
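You can also check the response from a second terminal with curl (assuming curl is installed):

```shell
# Fetch the root route; the body should read "Hello World"
curl http://127.0.0.1:5000/
```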
Watching file changes with nodemon
We can improve our npm start script with nodemon:

```shell
npm install --save-dev nodemon
```

Then we can add an npm run dev script:

```json
...
"scripts": {
  "build": "babel index.js -d dist",
  "dev": "nodemon --watch index.js --exec npm start",
  "start": "npm run build && node dist/index.js",
  "test": "echo \"Error: no test specified\" && exit 1"
},
...
```
Then we’ll restart our server:

```shell
npm run dev
```
You should now be able to make changes to index.js, and nodemon will restart the server automatically.
Go ahead and replace Hello World with Hello {{YOUR_NAME_HERE}} while our server is running.
If you visit http://127.0.0.1:5000 you should see our server greeting you.
Getting ready for production use
First let’s move our server index.js file to lib/index.js:

```shell
mkdir lib
mv index.js lib/index.js
```
And update our npm build script to reflect the location change:

```json
...
"scripts": {
  "build": "babel lib -d dist",
  "dev": "nodemon --watch index.js --exec npm start",
  "start": "npm run build && node dist/index.js",
  "test": "echo \"Error: no test specified\" && exit 1"
},
...
```
Next let’s add a new task, npm run serve, and update npm start and npm run dev:

```json
...
"scripts": {
  "build": "babel lib -d dist",
  "dev": "nodemon --watch lib --exec npm start",
  "start": "npm run build && npm run serve",
  "serve": "node dist/index.js",
  "test": "echo \"Error: no test specified\" && exit 1"
},
...
```
Now we can use npm run build for precompiling our assets, and npm run serve for starting our server in production:

```shell
npm run build
npm run serve
```
This means we can quickly restart our server without waiting for babel to recompile our files.
Oh, let’s not forget to add dist to our .gitignore file:

```shell
touch .gitignore
```

.gitignore:

```
dist
node_modules
```
This will make sure we don’t accidentally commit our built files and node_modules to git.
Testing the server
Let’s install mocha:

```shell
npm install --save-dev mocha
```
And create our test in test/index.js:

```shell
mkdir test
touch test/index.js
```

```js
import http from 'http';
import assert from 'assert';
import server from '../lib/index.js';

describe('Example Node Server', () => {
  it('should return 200', done => {
    http.get('http://127.0.0.1:5000', res => {
      assert.equal(200, res.statusCode);
      server.close();
      done();
    });
  });
});
```
Next, install @babel/register for the require hook:

```shell
npm install --save-dev @babel/register
```
Then we can update the npm test script:

```json
...
"scripts": {
  "build": "babel lib -d dist",
  "dev": "nodemon --watch lib --exec npm start",
  "start": "npm run build && npm run serve",
  "serve": "node dist/index.js",
  "test": "mocha --exit --require @babel/register"
},
...
```
Now let’s run our tests:

```shell
npm test
```

You should see the following:

```
Server running at http://127.0.0.1:5000/

  Example Node Server
    ✓ should return 200

  1 passing (43ms)
```
Setting Up Express Server
Install the express, pg, body-parser, and cors npm modules:

```shell
npm install --save express pg body-parser cors
```
Create a config.js file to read environment variables:

```shell
touch lib/config.js
```

```js
export default {
  pgUser: process.env.PGUSER,
  pgHost: process.env.PGHOST,
  pgDatabase: process.env.PGDATABASE,
  pgPassword: process.env.PGPASSWORD,
  pgPort: process.env.PGPORT
};
```
Update lib/index.js:

```js
import config from './config';

// Express App Setup
import express from 'express';
import bodyParser from 'body-parser';
import cors from 'cors';

const app = express();
app.use(cors());
app.use(bodyParser.json());

// Postgres Client Setup
import { Pool } from 'pg';
const pgClient = new Pool({
  user: config.pgUser,
  host: config.pgHost,
  database: config.pgDatabase,
  password: config.pgPassword,
  port: config.pgPort
});
pgClient.on('error', () => console.log('Lost PG connection'));

pgClient
  .query(
    'CREATE TABLE IF NOT EXISTS todos (id SERIAL PRIMARY KEY, content VARCHAR NOT NULL, done BOOLEAN NOT NULL DEFAULT FALSE)'
  )
  .catch(err => console.log(err));

// Express route handlers
app.get('/', (req, res) => {
  res.send('Hello World');
});

app.get('/todos', async (req, res) => {
  try {
    const todos = await pgClient.query('SELECT * FROM todos ORDER BY id DESC');
    res.send({ status: 'OK', data: { todos: todos.rows } });
  } catch (err) {
    res.status(422).send({ status: 'ERR', message: err });
  }
});

app.post('/todos', async (req, res) => {
  try {
    const content = req.body.content;
    const response = await pgClient.query(
      'INSERT INTO todos(content) VALUES($1) RETURNING *',
      [content]
    );
    res.send({ status: 'OK', data: { todo: response.rows[0] } });
  } catch (err) {
    res.status(422).send({ status: 'ERR', message: err });
  }
});

// Note: always pass user input as query parameters ($1) instead of
// interpolating it into the SQL string, to avoid SQL injection.
app.delete('/todos/:id', async (req, res) => {
  try {
    const id = req.params.id;
    const response = await pgClient.query('DELETE FROM todos WHERE id = $1', [id]);
    res.send({ status: 'OK', message: response });
  } catch (err) {
    res.status(422).send({ status: 'ERR', message: err });
  }
});

app.post('/todos/:id/check', async (req, res) => {
  try {
    const id = req.params.id;
    const response = await pgClient.query('UPDATE todos SET done = TRUE WHERE id = $1', [id]);
    res.send({ status: 'OK', message: response });
  } catch (err) {
    res.status(422).send({ status: 'ERR', message: err });
  }
});

app.delete('/todos/:id/check', async (req, res) => {
  try {
    const id = req.params.id;
    const response = await pgClient.query('UPDATE todos SET done = FALSE WHERE id = $1', [id]);
    res.send({ status: 'OK', message: response });
  } catch (err) {
    res.status(422).send({ status: 'ERR', message: err });
  }
});

export default app.listen(5000, () => {
  console.log('Listening');
});
```
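With the server running (npm run dev) and PostgreSQL reachable, you can exercise the routes from another terminal; the sample content string and the todo id 1 below are arbitrary:

```shell
# Create a todo
curl -X POST -H "Content-Type: application/json" \
  -d '{"content": "buy milk"}' http://localhost:5000/todos

# List todos
curl http://localhost:5000/todos

# Mark todo 1 as done, then as not done
curl -X POST http://localhost:5000/todos/1/check
curl -X DELETE http://localhost:5000/todos/1/check

# Delete todo 1
curl -X DELETE http://localhost:5000/todos/1
```

Each call returns a JSON body with a status field of OK or ERR.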
1.2 Creating the Client
Change directory into tutorial:

```shell
cd tutorial
```

Generate a new React app with CRA:

```shell
npx create-react-app client
```

Change directory into client:

```shell
cd client
```

Install axios for communication between the client and the server:

```shell
npm install --save axios
```
Add TodoList.js, TodoItem.js, and TodoItemMaker.js into client/src, and update App.js. The markup below is kept minimal; style it however you like.

TodoList.js:

```js
import React, { Component } from 'react';
import axios from 'axios';
import TodoItem from './TodoItem';
import TodoItemMaker from './TodoItemMaker';

let url = '/api/todos/';
let instance = axios;

// --- should only be used for testing locally;
// --- delete these lines when building containers
url = '/todos/';
instance = axios.create({ baseURL: 'http://localhost:5000' });
// --------------------------- //

export default class extends Component {
  state = { todos: [] };

  componentDidMount() {
    this.fetchTodos();
  }

  async fetchTodos() {
    try {
      const response = await instance.get(url);
      const todos = response.data.data.todos;
      this.setState({ todos });
    } catch (err) {}
  }

  deleteTodo = async deletingTodo => {
    try {
      const response = await instance.delete(`${url}${deletingTodo.id}`);
      if (response.data.status === 'OK') {
        this.setState({
          todos: this.state.todos.filter(todo => todo.id !== deletingTodo.id)
        });
      }
    } catch (err) {}
  };

  checkTodo = async checkingTodo => {
    try {
      const response = await instance.post(`${url}${checkingTodo.id}/check`);
      if (response.data.status === 'OK') {
        this.setState({
          todos: this.state.todos.map(todo => {
            if (todo.id === checkingTodo.id) return { ...todo, done: true };
            return todo;
          })
        });
      }
    } catch (err) {}
  };

  uncheckTodo = async uncheckingTodo => {
    try {
      const response = await instance.delete(`${url}${uncheckingTodo.id}/check`);
      if (response.data.status === 'OK') {
        this.setState({
          todos: this.state.todos.map(todo => {
            if (todo.id === uncheckingTodo.id) return { ...todo, done: false };
            return todo;
          })
        });
      }
    } catch (err) {}
  };

  createTodo = async (content, onSuccess) => {
    try {
      const response = await instance.post(url, { content });
      if (response.data.status === 'OK') {
        const newTodo = response.data.data.todo;
        this.setState({ todos: [newTodo, ...this.state.todos] });
        onSuccess();
      }
    } catch (err) {}
  };

  renderTodoList(todos) {
    return todos.map(todo => (
      <TodoItem
        key={todo.id}
        todo={todo}
        deleteTodo={this.deleteTodo}
        checkTodo={this.checkTodo}
        uncheckTodo={this.uncheckTodo}
      />
    ));
  }

  renderUncheckedTodoList() {
    const todos = this.state.todos;
    return this.renderTodoList(todos.filter(todo => !todo.done));
  }

  renderCheckedTodoList() {
    const checkedTodos = this.state.todos.filter(todo => todo.done);
    if (checkedTodos.length === 0) {
      return;
    }
    return (
      <div>
        <h3>===Done===</h3>
        {this.renderTodoList(checkedTodos)}
      </div>
    );
  }

  render() {
    return (
      <div>
        <TodoItemMaker createTodo={this.createTodo} />
        {this.renderUncheckedTodoList()}
        {this.renderCheckedTodoList()}
      </div>
    );
  }
}
```

TodoItem.js:

```js
import React from 'react';

export default ({ todo, deleteTodo, checkTodo, uncheckTodo }) => {
  return (
    <div>
      <span>
        {todo.id}. {todo.content}
      </span>
      {todo.done ? (
        <button onClick={() => uncheckTodo(todo)}>Uncheck</button>
      ) : (
        <button onClick={() => checkTodo(todo)}>Check</button>
      )}
      <button onClick={() => deleteTodo(todo)}>Delete</button>
    </div>
  );
};
```

TodoItemMaker.js:

```js
import React, { Component } from 'react';

export default class extends Component {
  state = { text: '' };

  emptyInput = () => {
    this.setState({ text: '' });
  };

  handleChange = event => {
    event.preventDefault();
    this.setState({ text: event.target.value });
  };

  onEnter = event => {
    if (event.key === 'Enter') {
      this.props.createTodo(event.target.value, this.emptyInput);
    }
  };

  render() {
    const { text } = this.state;
    return (
      <input
        value={text}
        onChange={this.handleChange}
        onKeyPress={this.onEnter}
        placeholder="What needs to be done?"
      />
    );
  }
}
```

App.js:

```js
import React from 'react';
import './App.css';
import TodoList from './TodoList';

function App() {
  return <TodoList />;
}

export default App;
```
1.3 Testing the App
Open a terminal window in the tutorial/server directory and start the server (replace your_home_folder_name with your local PostgreSQL user):

```shell
export PGUSER=your_home_folder_name PGHOST=localhost PGDATABASE=postgres PGPASSWORD=null PGPORT=5432
npm run dev
```

Open a terminal window in the tutorial/client directory:

```shell
npm start
```

It will open a new browser tab at localhost:3000, and you should be able to interact with the application.
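You can also confirm that the server created the todos table by querying PostgreSQL directly; the connection flags below mirror the environment variables exported above:

```shell
# List all todos straight from the database
psql -h localhost -p 5432 -d postgres -c 'SELECT * FROM todos;'
```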
2. Containerizing the Application and Running It with docker-compose
Copy the tutorial directory. From your workspace directory (the parent of tutorial):

```shell
cp -r tutorial dev-container
cd dev-container
```
2.1 Containerizing the Client
cd into client and remove the node_modules folder:

```shell
cd client
rm -rf node_modules
```
Update TodoList.js by deleting these lines:

```js
// --- should only be used for testing locally;
// --- delete these lines when building containers
url = '/todos/';
instance = axios.create({ baseURL: 'http://localhost:5000' });
// --------------------------- //
```
Add a Dockerfile.dev in the client directory (note that the exec form of CMD requires double quotes):

```shell
touch Dockerfile.dev
```

```dockerfile
FROM node:alpine
WORKDIR /app
COPY ./package.json ./
RUN npm install
COPY . .
CMD ["npm", "run", "start"]
```
In the client directory, build a client Docker image:

```shell
docker build -f Dockerfile.dev .
```

It will take a while, and when the image is finished building it will output a message like: Successfully built 4089dbaa23e6

Then you can use that image ID to run the client container:

```shell
docker run 4089dbaa23e6
```

The client container is not connected to the server yet, and it is not yet accessible through localhost, but you should be able to see from the terminal that it is running. After you have confirmed that the client dev server is running, exit with Ctrl+C.
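If you do want to reach the containerized dev server in a browser at this stage, you can publish the port when running the image (substitute your own image ID):

```shell
# Map the container's port 3000 to localhost:3000
docker run -p 3000:3000 4089dbaa23e6
```

With the port published, http://localhost:3000 serves the React app, though API calls will still fail until the server and database are wired up with docker-compose.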
2.2 Containerizing the Server
cd into server and remove the node_modules folder:

```shell
cd ../server
rm -rf node_modules
```

Add a Dockerfile.dev in the server directory:

```shell
touch Dockerfile.dev
```

```dockerfile
FROM node:alpine
WORKDIR /app
COPY ./package.json ./
RUN npm install
COPY . .
CMD ["npm", "run", "dev"]
```
In the server directory, build a server Docker image:

```shell
docker build -f Dockerfile.dev .
```

It will take a while, and when the image is finished building it will output a message like: Successfully built d3b8ce4d1ce1

Then you can use that image ID to run the server container:

```shell
docker run d3b8ce4d1ce1
```

Once you can see from the terminal that it cannot connect to PostgreSQL (it has no database to reach yet), exit with Ctrl+C.
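As a convenience, you can tag images at build time instead of copying image IDs out of the build output; the tag name below is arbitrary:

```shell
docker build -t tutorial-server -f Dockerfile.dev .
docker run tutorial-server
```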
2.3 Adding Nginx and PostgreSQL with docker-compose
Change directory back into dev-container and create an nginx directory:

```shell
cd ..
mkdir nginx
cd nginx
```
Create a default.conf for routing:

```shell
touch default.conf
```

```nginx
upstream client {
  server client:3000;
}

upstream api {
  server api:5000;
}

server {
  listen 80;

  location / {
    proxy_pass http://client;
  }

  location /sockjs-node {
    proxy_pass http://client;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "Upgrade";
  }

  location /api {
    rewrite /api/(.*) /$1 break;
    proxy_pass http://api;
  }
}
```
Add a Dockerfile.dev for nginx:

```shell
touch Dockerfile.dev
```

```dockerfile
FROM nginx
COPY ./default.conf /etc/nginx/conf.d/
```
Change directory back into dev-container and create docker-compose.yaml:

```shell
cd ..
touch docker-compose.yaml
```

```yaml
version: '3'
services:
  postgres:
    image: 'postgres:latest'
    # Recent postgres images refuse to start without a password;
    # it must match PGPASSWORD below.
    environment:
      - POSTGRES_PASSWORD=postgres_password
  nginx:
    restart: always
    build:
      dockerfile: Dockerfile.dev
      context: ./nginx
    ports:
      - '3050:80'
  api:
    build:
      dockerfile: Dockerfile.dev
      context: ./server
    volumes:
      - /app/node_modules
      - ./server:/app
    environment:
      - PGUSER=postgres
      - PGHOST=postgres
      - PGDATABASE=postgres
      - PGPASSWORD=postgres_password
      - PGPORT=5432
  client:
    build:
      dockerfile: Dockerfile.dev
      context: ./client
    volumes:
      - /app/node_modules
      - ./client:/app
```
In the dev-container directory, run:

```shell
docker-compose up
```

The application may not start up on the first try (the api container can come up before postgres is ready); if so, press Ctrl+C and run docker-compose up again. Once all the images are built and the containers are running, you should be able to access the UI at http://localhost:3050.
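You can verify the nginx routing from the terminal as well: requests under /api are rewritten and proxied to the api service, while everything else goes to the client dev server.

```shell
# Served by the client container (React dev server)
curl http://localhost:3050/

# Rewritten to /todos and served by the api container
curl http://localhost:3050/api/todos/
```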
Whenever you want to rebuild the Docker images, run:

```shell
docker-compose up --build
```
The benefit of setting up volumes in docker-compose is that any change you make to the source code in the client and server directories is reflected in the running containers immediately.
Next time, we will look into deploying this application locally on Kubernetes via Minikube.
Mondrian AI
An AI / big data / data analysis and visualization company
7F, Michuhol Tower, Songdo-dong, Incheon
https://mondrian.ai
032-713-7984
contact@mondrian.ai