How to build docker images in a lerna monorepo without publishing?

The use case for this is branch building and deployments in Lerna monorepos.

The problem is that Lerna monorepos either hoist dependencies with npm or use yarn workspaces to the same effect, collecting all dependencies in the node_modules folder of the workspace/monorepo root. This means they will not be accessible when building Dockerfiles in subfolders, due to how Docker build contexts work.

I imagine what is needed here is a kind of "lower" (as opposed to hoist) function to pull package dependencies into the node_modules of the Docker/package.json project before running docker build.
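
Roughly, I picture something like the following hypothetical script (this is not an existing Lerna or yarn command, the paths are made up, and it ignores transitive dependencies, which is part of why I'm asking). It would run as node lower-deps.js packages/server before docker build:

// lower-deps.js: hypothetical "lower" step, copying a package's dependencies
// from the hoisted root node_modules down into the package's own node_modules,
// so that a docker build using the package folder as context can see them.
const fs = require('fs');
const path = require('path');

const pkgDir = process.argv[2];                    // e.g. packages/server
const rootModules = path.resolve('node_modules');  // hoisted modules at the repo root
const pkg = JSON.parse(fs.readFileSync(path.join(pkgDir, 'package.json'), 'utf8'));

for (const dep of Object.keys(pkg.dependencies || {})) {
  const src = path.join(rootModules, dep);
  const dest = path.join(pkgDir, 'node_modules', dep);
  if (fs.existsSync(src) && !fs.existsSync(dest)) {
    fs.mkdirSync(path.dirname(dest), { recursive: true });
    // dereference follows the symlinks that yarn/lerna create for local
    // workspace packages (fs.cpSync needs Node 16.7+)
    fs.cpSync(src, dest, { recursive: true, dereference: true });
  }
}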

The question is, does anyone have a better idea, or know of an already existing method to do this?

Asked Dec 13, 2019 at 10:12 by Gudlaugur Egilsson
  • 2 The approach I used is to publish local dependencies to a local npm server (verdaccio), create a Dockerfile in each package that needs to be built, run docker build using the -f option, and install each one from the local npm server. – user2473015 Commented Jan 1, 2020 at 3:33
  • 1 That is an option I have been considering. Are you happy with that approach, complexity- and speed-wise? – Gudlaugur Egilsson Commented Jan 2, 2020 at 9:19
  • 1 We use this method: stackoverflow.com/questions/56294568/… – Felix Commented Jan 3, 2020 at 10:38
  • 1 Since I only need to dockerize a couple of packages (and am not using yarn), I've been using "tar chf ." to slurp up node_modules (dereferencing symlinks with the 'h' flag) and ADDing the tarball to Docker. It's ugly and slow, but easy (sketched below). – jamey graham Commented Oct 28, 2021 at 19:47
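
A rough sketch of the tar-based workaround from the last comment, assuming it is run from the repository root and that the tarball is dropped next to the package's Dockerfile (all paths are illustrative):

# 'c' creates the archive, 'h' dereferences the workspace symlinks
tar chf packages/server/node_modules.tar node_modules

# then, inside packages/server/Dockerfile, ADD auto-extracts local tar
# archives, recreating the hoisted node_modules inside the image
ADD node_modules.tar /app/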

3 Answers


For my own project, the solution is to use Docker BuildKit to first build the whole workspace and then build a Docker image for the project workspace, reusing the previously built files.

In detail: in the Dockerfile you copy the top-level package.json together with the yarn.lock, then cherry-pick the package.json of each workspace you need. After that, running yarn install and yarn build gets everything working.

Here is the Dockerfile from my project:

# base image
FROM @myscope/base:latest as base

# set working directory
WORKDIR /app

# add `/app/node_modules/.bin` to $PATH
ENV PATH /app/node_modules/.bin:$PATH

# install and cache app dependencies
COPY ["package.json","yarn.lock", "./"]
COPY ./packages/server/package.json ./packages/server/
COPY ./packages/shared/package.json ./packages/shared/
COPY ./packages/app/package.json ./packages/app/

RUN yarn install --frozen-lockfile --non-interactive --production=false --ignore-scripts
COPY . /app
RUN yarn build



# production image
FROM node:14.15 as serverapp

WORKDIR /app

COPY ["package.json","yarn.lock", "./"]
COPY ./packages/server/package.json ./packages/server/
COPY ./packages/shared/package.json ./packages/shared/

RUN yarn install --frozen-lockfile --non-interactive --production=true --ignore-scripts

# copy artifact build from the 'build environment'
COPY --from=base /app/packages/shared/dist /app/packages/shared/dist

COPY ["./packages/server/", "./packages/server/"]

WORKDIR /app/packages/server
VOLUME ["/app/packages/server/logs", "/app/packages/server/uploads"]
# PORT has to be defined at build time for EXPOSE; the default here is an assumption
ARG PORT=3000
EXPOSE $PORT
CMD ["yarn", "start"]

shared is a private workspace that is a dependency of the server workspace.
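
Assuming the Dockerfile above lives at packages/server/Dockerfile, it has to be built with the monorepo root as the build context so that the COPY ./packages/... instructions can see the sibling workspaces, for example (the image tag is just an example):

DOCKER_BUILDKIT=1 docker build -f packages/server/Dockerfile -t myscope/server .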

Since no existing answer was satisfying for me, I have built an npm package which generates a Dockerfile for Lerna projects. It uses Docker's multi-stage build feature and creates a stage for each package.

Current Version:

Run the following command to let lerna-dockerize set up all the required configuration:

npx lerna-dockerize init

Original Version:

You simply need at least two Dockerfiles:

The base Dockerfile which contains the global setup

Dockerfile.base:

FROM node:14 as base
COPY ./package.json ./
RUN npm i
COPY ./lerna.json ./

and a template for the packages

Dockerfile.template:

FROM base as build
COPY ./package.json ./
RUN npm install
# --if-exists is handled by lerna-dockerize when generating the final Dockerfile, not by Docker itself
RUN --if-exists npm run build

You can also have custom Dockerfiles for individual packages by simply adding your own Dockerfile inside the package. This will replace the template with your custom Dockerfile.

Afterwards you can run the command via npx to generate your Dockerfile:

npx lerna-dockerize

This is an excellent question. For me, the whole point of using yarn workspaces was to get rid of a private npm registry.

My approach uses webpack in conjunction with webpack-node-externals and generate-package-json-webpack-plugin (see npmjs.com/package/generate-package-json-webpack-plugin).

With node externals, we can bundle all the dependencies from our other workspaces (libs) into the app, which makes a private npm registry obsolete. With the generate-package-json plugin, a new package.json is created containing all dependencies except our workspace dependencies. With this package.json next to the bundle, we can run npm or yarn install in the Dockerfile.
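
A minimal sketch of such a webpack config, assuming the workspaces are scoped under @myscope and that the entry point, version, and output names are placeholders:

// webpack.config.js: bundle our own workspace packages, keep third-party
// dependencies external, and emit a package.json listing only the externals.
const path = require('path');
const nodeExternals = require('webpack-node-externals');
const GeneratePackageJsonPlugin = require('generate-package-json-webpack-plugin');

const basePackage = {
  name: '@myscope/app',
  version: '1.0.0',
  main: './main.js',
};

module.exports = {
  target: 'node',
  mode: 'production',
  entry: './src/main.js',
  output: {
    path: path.resolve(__dirname, 'dist'),
    filename: 'main.js',
  },
  // everything in node_modules stays a require() call, except our own
  // @myscope/* workspaces, which get bundled into the output
  externals: [nodeExternals({ allowlist: [/^@myscope\//] })],
  plugins: [
    // writes dist/package.json containing only the dependencies that
    // remained external, i.e. without the workspace packages
    new GeneratePackageJsonPlugin(basePackage),
  ],
};

Depending on where hoisting puts node_modules, the nodeExternals call may also need its modulesDir option pointed at the workspace root.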

Setting up the webpack config took a few hours, but I think it scales very well and keeps the Dockerfile clean.
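
With the generated package.json sitting next to the bundle in dist, the resulting Dockerfile can stay very small; a sketch continuing the placeholder names from the config above (base image and paths are illustrative):

FROM node:14
WORKDIR /app
# dist contains the bundle plus the generated package.json
COPY dist/ ./
RUN npm install --production
CMD ["node", "main.js"]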
