In this post, I'd like to share my journey of rebuilding my old Brain Piano project on a modern architecture stack using Google Cloud Platform.
posted at: 15/9/2023
After learning more about cloud-based microservice architectures, I wanted to bring the Brain Piano project back to life. In 2019, I carried two laptops to run the demo: one ran a compiled version of the Magenta code and the other ran the client-side JavaScript, since I had only limited knowledge of ML deployment at the time.
Fig 1. Brain Piano backend hosted a pre-trained ML model (PolyphonyRNN) in a local server.
The most recent Magenta repo suggests testing the model in a Colab environment because of its specific software dependencies (OS, Python environment).
Magenta is an open source research project introduced by the Google Brain team in 2017, exploring the role of machine learning as a tool in the creative process.
However, Colab executes code in a hosted runtime environment, which is unsuitable for my use case. Moreover, I found that the Docker environment, which used to be offered as an alternative installation method, has been removed from the official documentation.
Thankfully, I came across this amazing repo by xychelsea, who recreated the Magenta Docker environment. Below is my version of the docker-compose.yml file that gets it working on macOS (Ventura 13.5.1, M1).
```yaml
# docker-compose.yml file
version: "3.7"

networks:
  default:
    driver: bridge

services:
  portainer:
    image: portainer/portainer-ce:latest
    ports:
      - "9000:9000"
    restart: always
    volumes:
      - portainer-data:/data
      - /var/run/docker.sock:/var/run/docker.sock
  magenta:
    platform: linux/amd64
    image: xychelsea/magenta:latest-jupyter
    container_name: magenta
    restart: always
    ports:
      - "8888:8888"
    volumes:
      - ./volume/magenta:/usr/local/magenta/workspace

volumes:
  portainer-data:
```
Portainer is a GUI for Docker management, and it is itself a dockerized web app.
I added the platform flag to pin a specific platform (linux/amd64), which is needed to run this amd64 image on an M1 Mac.
I decided to deploy the code to a GCP free-tier instance (e2-micro) to create a publicly accessible server.
To receive commands from the front-end and pass messages to the dockerized Magenta code, I created a simple Flask API server.
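As a rough illustration, the relay server could look like the sketch below. The `/generate` endpoint name matches the one shown later in the post, but the container command, flags, and paths are assumptions for illustration; the actual repo may wire things differently.

```python
# Minimal sketch of a Flask relay server that triggers Magenta inside the
# "magenta" container (the container_name from docker-compose.yml).
# Hypothetical command/paths; adjust to match the actual setup.
import subprocess

from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/generate", methods=["GET"])
def generate():
    # polyphony_rnn_generate is Magenta's CLI for the PolyphonyRNN model.
    result = subprocess.run(
        ["docker", "exec", "magenta", "polyphony_rnn_generate",
         "--output_dir=/usr/local/magenta/workspace/output",
         "--num_outputs=1"],
        capture_output=True, text=True)
    if result.returncode != 0:
        return jsonify({"status": "error", "detail": result.stderr}), 500
    return jsonify({"status": "ok"})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

Running it on the instance exposes the endpoint publicly once the port is opened in the GCP firewall rules.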
The MIDI file generated by Magenta is then uploaded to Google Drive, which serves as shared storage for input/output files with the front-end. The diagram below summarizes what happens inside the GCP instance.
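The post doesn't show the upload code itself, so here is a hedged, stdlib-only sketch against the Drive v3 multipart upload REST endpoint. It assumes an OAuth access token has already been obtained (token handling is omitted); the real project may use a Google client library instead.

```python
# Sketch: upload a generated MIDI file to Google Drive via the v3 REST API.
# Assumes a valid OAuth bearer token; names and boundary are illustrative.
import json
import urllib.request

BOUNDARY = "brain-piano-upload"

def build_multipart_body(name: str, midi_bytes: bytes) -> bytes:
    """Build the multipart/related body for a Drive v3 multipart upload."""
    metadata = json.dumps({"name": name, "mimeType": "audio/midi"})
    return (
        f"--{BOUNDARY}\r\n"
        "Content-Type: application/json; charset=UTF-8\r\n\r\n"
        f"{metadata}\r\n"
        f"--{BOUNDARY}\r\n"
        "Content-Type: audio/midi\r\n\r\n"
    ).encode() + midi_bytes + f"\r\n--{BOUNDARY}--".encode()

def upload_midi(path: str, token: str) -> None:
    with open(path, "rb") as f:
        body = build_multipart_body(path.rsplit("/", 1)[-1], f.read())
    req = urllib.request.Request(
        "https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": f"multipart/related; boundary={BOUNDARY}",
        },
        method="POST",
    )
    urllib.request.urlopen(req)  # network call; raises on HTTP errors
```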
This time, I will not cover the specific steps to deploy the Docker container to the GCP instance. Instead, I recommend following the details in this article.
The picture below shows the public web page deployed to the GCP instance. This Swagger UI page lists the HTTP endpoints of the Flask API.
When it receives an HTTP GET request at the /generate endpoint, the backend server generates classical music in the style of Bach, conditioned on the provided melody line.
Open this link to listen to the generated music pieces.
However, the music generation process is extremely slow: generating 10 seconds of music took around 30 seconds, due to the limited hardware of the free GCP instance (2 vCPU | 1024 MiB RAM | 10 GB standard (non-SSD) storage).
I started searching for better options to host the ML service; that journey will be covered in the next post.
Detailed instructions and source code can be found in my GitHub repo.