The following error was output when executing the codecept -c src run acceptance command on Alpine Linux:
base64: unrecognized option: w
BusyBox v1.30.1 (2019-06-12 17:51:55 UTC) multi-call binary.

Usage: base64 [-d] [FILE]

Base64 encode or decode FILE to standard output

        -d      Decode data
I can't see the exact command being executed by codecept.
I tried to install base64 with apk, but no package named base64 exists.
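(As a hedged aside: Codeception is built on Symfony Console, so rerunning with its debug and verbosity flags usually prints the steps and the underlying shell commands it executes. A sketch, assuming the standard codecept run options:)

# raise verbosity to see what codecept runs under the hood (-d/--debug, -vvv)
codecept -c src run acceptance --debug -vvv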
Dockerfile:
FROM node:10-alpine AS node
FROM php:7.1-fpm-alpine
ENV GITHUB_OAUTH_TOKEN test
ENV COMPOSER_ASSET_VERSION 1.3.1
ENV DOCKERIZE_VERSION v0.6.1
ENV PATH=~/.composer/vendor/bin:$PATH
# install packages
RUN apk add -U --no-cache \
    curl-dev \
    libxml2-dev \
    libpng-dev \
    libjpeg-turbo-dev \
    zip \
    libzip-dev \
    unzip \
    gmp-dev \
    python \
    make \
    autoconf \
    memcached-dev \
    libmemcached-dev \
    libmcrypt-dev \
    icu-dev \
    g++
RUN pecl install xdebug \
    memcached
# install PHP extensions
RUN docker-php-source extract \
    && cp /usr/src/php/ext/openssl/config0.m4 /usr/src/php/ext/openssl/config.m4
RUN docker-php-ext-configure gd --with-png-dir=/usr/include --with-jpeg-dir=/usr/include \
    && docker-php-ext-configure soap --enable-soap
RUN docker-php-ext-install \
    pdo \
    pdo_mysql \
    mysqli \
    mbstring \
    mcrypt \
    xml \
    intl \
    opcache \
    gd \
    soap \
    zip \
    && docker-php-ext-enable xdebug \
    memcached
# install composer
RUN curl -sS https://getcomposer.org/installer | php \
    && mv composer.phar /usr/local/bin/composer
# install composer plugin
RUN composer global require hirak/prestissimo \
    && composer config --global github-oauth.github.com $GITHUB_OAUTH_TOKEN \
    && composer config -g repos.packagist composer https://packagist.jp \
    && composer global require fxp/composer-asset-plugin:^$COMPOSER_ASSET_VERSION
# install dockerize
#RUN wget https://github.com/jwilder/dockerize/releases/download/$DOCKERIZE_VERSION/dockerize-alpine-linux-amd64-$DOCKERIZE_VERSION.tar.gz \
#    && tar -C /usr/local/bin -xzvf dockerize-alpine-linux-amd64-$DOCKERIZE_VERSION.tar.gz \
#    && rm dockerize-alpine-linux-amd64-$DOCKERIZE_VERSION.tar.gz
# add node.js npm
COPY --from=node /usr/local /usr/local
RUN mkdir -p /project/test
WORKDIR /project/test
CMD ["php-fpm"]
#RUN rm /usr/local/bin/yarn /usr/local/bin/yarnpkg
The Alpine images seem to ship a version of base64 (the BusyBox applet) which doesn't provide the -w option:
docker container run -it --rm alpine:3.9 base64 --help
BusyBox v1.29.3 (2019-01-24 07:45:07 UTC) multi-call binary.
Usage: base64 [-d] [FILE]
Base64 encode or decode FILE to standard output
        -d      Decode data
but if you execute apk add --update coreutils, it's there:
docker container run -it --rm alpine:3.9
/ # apk add --update coreutils
fetch http://dl-cdn.alpinelinux.org/alpine/v3.9/main/x86_64/APKINDEX.tar.gz
fetch http://dl-cdn.alpinelinux.org/alpine/v3.9/community/x86_64/APKINDEX.tar.gz
(1/3) Installing libattr (2.4.47-r7)
(2/3) Installing libacl (2.2.52-r5)
(3/3) Installing coreutils (8.30-r0)
Executing busybox-1.29.3-r10.trigger
OK: 7 MiB in 17 packages
/ # base64 --help
Usage: base64 [OPTION]... [FILE]
Base64 encode or decode FILE, or standard input, to standard output.
With no FILE, or when FILE is -, read standard input.
Mandatory arguments to long options are mandatory for short options too.
  -d, --decode          decode data
  -i, --ignore-garbage  when decoding, ignore non-alphabet characters
  -w, --wrap=COLS       wrap encoded lines after COLS character (default 76).
                          Use 0 to disable line wrapping
      --help     display this help and exit
      --version  output version information and exit
The data are encoded as described for the base64 alphabet in RFC 4648.
When decoding, the input may contain newlines in addition to the bytes of
the formal base64 alphabet. Use --ignore-garbage to attempt to recover
from any other non-alphabet bytes in the encoded stream.
GNU coreutils online help: <https://www.gnu.org/software/coreutils/>
Report base64 translation bugs to <https://translationproject.org/team/>
Full documentation at: <https://www.gnu.org/software/coreutils/base64>
or available locally via: info '(coreutils) base64 invocation'
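Applied to the Dockerfile in the question, a minimal sketch of the fix is to install coreutils alongside the other packages, so the GNU base64 (which supports -w/--wrap) replaces the BusyBox applet:

# minimal sketch: add GNU coreutils so base64 gains the -w/--wrap option
RUN apk add -U --no-cache coreutils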