repaired docker, added docker-compose

parent c17aa14601
commit 24d287cd90

--- a/Dockerfile
+++ b/Dockerfile
@@ -1,4 +1,6 @@
-FROM ubuntu:14.04
+FROM ubuntu:18.04
 
+ENV DEBIAN_FRONTEND=noninteractive
+
 RUN apt-get update && apt-get install -y \
     pkg-config \
@@ -8,19 +10,26 @@ RUN apt-get update && apt-get install -y \
     libblas-dev \
     liblapack-dev \
     libatlas-base-dev \
+    libsndfile1-dev \
+    libasound2-dev \
+    libjack-dev \
     gfortran \
+    ffmpeg \
+    llvm-8 \
     python \
-    python-dev \
-    python-pip \
+    python3 \
+    python3-dev \
+    python3-pip \
+    python3-venv \
+    nvidia-cuda-dev \
     curl && \
-    curl -sL https://deb.nodesource.com/setup_7.x | sudo -E bash - && \
-    apt-get install -y nodejs
+    curl -sL https://deb.nodesource.com/setup_10.x | bash - && \
+    apt update && apt-get install -y nodejs && apt clean
 
-RUN pip install -U https://storage.googleapis.com/tensorflow/linux/cpu/tensorflow-0.12.1-cp27-none-linux_x86_64.whl
+RUN pip3 install --upgrade pip
 
 COPY ./server/requirements.txt /tmp/
-RUN pip install -r /tmp/requirements.txt
+RUN pip3 install -r /tmp/requirements.txt
 
 COPY . /src/
 
@@ -30,4 +39,4 @@ RUN npm install && npm run build
 WORKDIR /src/server/
 
 EXPOSE 8080
-ENTRYPOINT python server.py
+ENTRYPOINT python3 server.py
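A note on the `ENTRYPOINT` line above: both the old and new versions use the shell form, which runs the server under `/bin/sh -c`, so Python is a child of the shell and does not receive SIGTERM directly on `docker stop`. The exec form avoids the intermediate shell; a possible alternative, not part of this commit:

```dockerfile
# Exec form: python3 runs as PID 1 and receives container signals directly.
ENTRYPOINT ["python3", "server.py"]
```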
--- a/README.md
+++ b/README.md
@@ -18,35 +18,6 @@ Built by [Yotam Mann](https://github.com/tambien) with friends on the Magenta an
 
 A.I. Duet is composed of two parts, the front-end which is in the `static` folder and the back-end which is in the `server` folder. The front-end client creates short MIDI files using the players's input which is sent to a [Flask](http://flask.pocoo.org/) server. The server takes that MIDI input and "continues" it using [Magenta](https://github.com/tensorflow/magenta) and [TensorFlow](https://www.tensorflow.org/) which is then returned back to the client.
 
-## INSTALLATION
-
-A.I. Duet only works with [Python 2.7](https://www.python.org/download/releases/2.7/) and it was tested with Node v6. There are two basic ways of installing A.I. Duet: with Docker or without Docker.
-
-If you already have a Python environment setup, install all of the server dependencies and start the server by typing the following in the terminal:
-
-```bash
-cd server
-pip install -r requirements.txt
-```
-
-If this does not work, jump down to the [Docker](#docker) installation instructions, which will walk you through installing A.I. Duet within a Docker container.
-
-If it _did_ install tensorflow and magenta successfully, you can run the server by typing:
-
-```bash
-python server.py
-```
-
-Then to build and install the front-end Javascript code, first make sure you have [Node.js](https://nodejs.org) 6 installed. And then install of the dependencies of the project and build the code by typing the following in the terminal:
-
-```bash
-cd static
-npm install
-npm run build
-```
-
-You can now play with A.I. Duet at [localhost:8080](http://localhost:8080).
-
 ## DOCKER
 
 [Docker](https://www.docker.com/) is an open-source containerization software which simplifies installation across various OSes. It is the simplest method to build and install both the front-end and back-end components. Once you have Docker installed, you can just run:
@@ -58,6 +29,15 @@ $ sudo docker run -t -p 8080:8080 ai-duet
 
 You can now play with A.I. Duet at [localhost:8080](http://localhost:8080).
 
+## DOCKER-COMPOSE
+
+```bash
+$ docker-compose build
+$ docker-compose up -d
+```
+
+You can now play with A.I. Duet at [localhost:8080](http://localhost:8080).
+
 ## MIDI SUPPORT
 
 The A.I. Duet supports MIDI keyboard input using [Web Midi API](https://webaudio.github.io/web-midi-api/) and the [WebMIDI](https://github.com/cotejp/webmidi) library.
--- /dev/null
+++ b/docker-compose.yml
@@ -0,0 +1,19 @@
+version: '2'
+
+services:
+  aiduet:
+    build: ./
+    image: aiduet
+    container_name: aiduet
+    hostname: aiduet
+    restart: "no"
+    environment:
+      - NVIDIA_VISIBLE_DEVICES=all
+    devices:
+      - /dev/nvidia0
+      - /dev/nvidiactl
+      - /dev/nvidia-uvm
+      - /dev/nvidia-uvm-tools
+    cap_add:
+      - IPC_LOCK
+    network_mode: host
--- a/server/requirements.txt
+++ b/server/requirements.txt
@@ -1,5 +1,6 @@
-tensorflow==0.12.1
-magenta==0.1.8
-Flask==0.12
-gunicorn==19.6.0
-ipython==5.1.0
+tensorflow-gpu==1.14.0
+magenta==1.1.8
+numba==0.53.1
+Flask
+gunicorn
+ipython
--- a/server/server.py
+++ b/server/server.py
@@ -22,7 +22,7 @@ import sys
 if sys.version_info.major <= 2:
     from cStringIO import StringIO
 else:
-    from io import StringIO
+    from io import StringIO, BytesIO
 import time
 import json
 
@@ -34,7 +34,8 @@ app = Flask(__name__, static_url_path='', static_folder=os.path.abspath('../stat
 def predict():
     now = time.time()
     values = json.loads(request.data)
-    midi_data = pretty_midi.PrettyMIDI(StringIO(''.join(chr(v) for v in values)))
+    # midi_data = pretty_midi.PrettyMIDI(StringIO(''.join(chr(v) for v in values)))
+    midi_data = pretty_midi.PrettyMIDI(BytesIO(b''.join((v).to_bytes(1, 'little') for v in values)))
     duration = float(request.args.get('duration'))
     ret_midi = generate_midi(midi_data, duration)
     return send_file(ret_midi, attachment_filename='return.mid',
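The Python 3 byte-reconstruction in the hunk above can be exercised on its own. A minimal sketch, with a hypothetical `values` list standing in for the JSON-decoded request body (no `pretty_midi` required):

```python
from io import BytesIO

# Hypothetical payload: the first four bytes of a MIDI file (the "MThd"
# header), as the list of ints the client would POST as JSON.
values = [77, 84, 104, 100]

# Rebuild the raw bytes the same way the patched handler does.
data = b''.join(v.to_bytes(1, 'little') for v in values)

# bytes(values) is an equivalent, shorter spelling for ints in 0-255.
assert data == bytes(values)

buf = BytesIO(data)  # what the new code hands to pretty_midi.PrettyMIDI
print(buf.read())    # b'MThd'
```

The old `StringIO(''.join(chr(v) ...))` only worked on Python 2, where `str` was a byte string; on Python 3 `pretty_midi` needs a binary file-like object, hence `BytesIO`.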