I would like to create multiple Cloud Functions for Firebase and deploy them all at the same time from one project. I would also like to separate each function into a separate file. Currently I can create multiple functions if I put them both in index.js such as:
exports.foo = functions.database.ref('/foo').onWrite(event => {
...
});
exports.bar = functions.database.ref('/bar').onWrite(event => {
...
});
However I would like to put foo and bar in separate files. I tried this:
/functions
|--index.js (blank)
|--foo.js
|--bar.js
|--package.json
where foo.js is
exports.foo = functions.database.ref('/foo').onWrite(event => {
...
});
and bar.js is
exports.bar = functions.database.ref('/bar').onWrite(event => {
...
});
Is there a way to accomplish this without putting all functions in index.js?
Our solution for handling Cloud Functions for Firebase is to name every function automatically from its location: save your functions in a folder structure that mirrors your Realtime Database structure, and name each file after the trigger it uses (onWrite, onCreate, onUpdate, onChange or onDelete).
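For illustration, here is a rough sketch of the kind of index.js loader this implies. The triggers/ folder name, the naming scheme and the helper below are just an example (not our exact code), and each file is assumed to export a single handler function:
// index.js – illustrative loader that names functions from their file location
const fs = require('fs');
const path = require('path');
const functions = require('firebase-functions');

const TRIGGERS_DIR = path.join(__dirname, 'triggers'); // e.g. triggers/foo/onWrite.js

function collect(dir, segments) {
    fs.readdirSync(dir).forEach((entry) => {
        const full = path.join(dir, entry);
        if (fs.statSync(full).isDirectory()) {
            collect(full, segments.concat(entry));
        } else if (entry.endsWith('.js')) {
            const trigger = entry.replace('.js', '');                 // e.g. "onWrite"
            const refPath = '/' + segments.join('/');                 // e.g. "/foo"
            const name = segments.join('') +
                trigger.charAt(0).toUpperCase() + trigger.slice(1);   // e.g. "fooOnWrite"
            const builder = functions.database.ref(refPath);
            if (typeof builder[trigger] === 'function') {
                exports[name] = builder[trigger](require(full));
            }
        }
    });
}

collect(TRIGGERS_DIR, []);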
Ah, Cloud Functions for Firebase loads Node modules normally, so this works.
structure:
/functions
|--index.js
|--foo.js
|--bar.js
|--package.json
index.js:
const functions = require('firebase-functions');
const fooModule = require('./foo');
const barModule = require('./bar');
exports.foo = functions.database.ref('/foo').onWrite(fooModule.handler);
exports.bar = functions.database.ref('/bar').onWrite(barModule.handler);
foo.js:
exports.handler = (event) => {
...
};
bar.js:
exports.handler = (event) => {
...
};
The answer by @jasonsirota was very helpful. But it may be useful to see more detailed code, especially in the case of HTTP-triggered functions.
Using the same structure as in @jasonsirota's answer, let's say you wish to have two separate HTTP trigger functions in two different files:
directory structure:
/functions
|--index.js
|--foo.js
|--bar.js
|--package.json
index.js:
'use strict';
const fooFunction = require('./foo');
const barFunction = require('./bar');
// Note: do the initialization tasks below in index.js and
// NOT in the child functions:
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp(functions.config().firebase);
const database = admin.database();
// Pass database to child functions so they have access to it
exports.fooFunction = functions.https.onRequest((req, res) => {
fooFunction.handler(req, res, database);
});
exports.barFunction = functions.https.onRequest((req, res) => {
barFunction.handler(req, res, database);
});
foo.js:
exports.handler = function(req, res, database) {
// Use database to declare databaseRefs:
const usersRef = database.ref('users');
...
res.send('foo ran successfully');
}
bar.js:
exports.handler = function(req, res, database) {
// Use database to declare databaseRefs:
const usersRef = database.ref('users');
...
res.send('bar ran successfully');
}
Update: TypeScript is now fully supported, so there is no need for the shenanigans below. Just use the Firebase CLI.
Here is how I personally did it with TypeScript:
/functions
|--src
|--index.ts
|--http-functions.ts
|--main.js
|--db.ts
|--package.json
|--tsconfig.json
Let me preface this with the two warnings needed to make this work:
1. In index.ts the order matters: main must be exported before the other function files.
2. The db must be exported from its own separate file.
For point number 2 I'm not sure why. Also, you should respect my configuration of index, main and db exactly (at least to try it out).
index.ts: deals with exports. I find it cleaner to let index.ts handle only the exports.
// main must be before functions
export * from './main';
export * from "./http-functions";
main.ts: Deals with initialization.
import { config } from 'firebase-functions';
import { initializeApp } from 'firebase-admin';
initializeApp(config().firebase);
export * from "firebase-functions";
db.ts: just re-exports the db so its name is shorter than database().
import { database } from "firebase-admin";
export const db = database();
http-functions.ts
// db must be imported like this
import { db } from './db';
// you can now import everything from index.
import { https } from './index';
// or (both work)
// import { https } from 'firebase-functions';
export let newComment = https.onRequest(createComment);
export async function createComment(req: any, res: any){
db.ref('comments').push(req.body.comment);
res.send(req.body.comment);
}
With Node 8 LTS now available with Cloud/Firebase Functions you can do the following with spread operators:
"engines": {
"node": "8"
},
const functions = require("firebase-functions");
const admin = require("firebase-admin");
admin.initializeApp();
module.exports = {
...require("./lib/foo.js"),
// ...require("./lib/bar.js") // add as many as you like
};
const functions = require("firebase-functions");
const admin = require("firebase-admin");
exports.fooHandler = functions.database
.ref("/food/{id}")
.onCreate((snap, context) => {
let id = context.params["id"];
return admin
.database()
.ref(`/bar/${id}`)
.set(true);
});
To keep it simple (but it does the job), I have personally structured my code like this.
Layout
├── /src/
│ ├── index.ts
│ ├── foo.ts
│ ├── bar.ts
| ├── db.ts
└── package.json
foo.ts
import * as functions from 'firebase-functions';
export const fooFunction = functions.database.ref('/foo').onWrite((change, context) => {
    // do your function.
});

export const someOtherFunction = functions.database.ref('/some/path').onWrite((change, context) => {
    // do the thing.
});
bar.ts
import * as functions from 'firebase-functions';
export const barFunction = functions.database.ref('/bar').onWrite((change, context) => {
    // do your function.
});

export const anotherFunction = functions.database.ref('/another/path').onWrite((change, context) => {
    // do the thing.
});
db.ts
import * as admin from 'firebase-admin';
import * as functions from 'firebase-functions';
export const firestore = admin.firestore();
export const realtimeDb = admin.database();
index.ts
import * as admin from 'firebase-admin';
import * as functions from 'firebase-functions';
admin.initializeApp(functions.config().firebase);
// the code above is only needed if you use firebase-admin
export * from './foo';
export * from './bar';
This works for directories nested to any depth; just follow the same pattern inside the nested directories too.
Credit to @zaidfazil's answer.
bigcodenerd.org outlines a simpler architecture pattern for keeping methods in separate files and exporting each of them in one line within the index.js file.
The architecture for the project in this sample is the following:
projectDirectory
|--index.js
|--podcast.js
|--profile.js
index.js
const admin = require('firebase-admin');
const podcast = require('./podcast');
const profile = require('./profile');
admin.initializeApp();
exports.getPodcast = podcast.getPodcast();
exports.removeProfile = profile.removeProfile();
podcast.js
const functions = require('firebase-functions');
exports.getPodcast = () => functions.https.onCall(async (data, context) => {
...
return { ... }
});
The same pattern would be used for the removeProfile method in the profile file.
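For completeness, here is a sketch of what profile.js could look like under the same pattern; the body is an assumption, since the original only names the removeProfile method:
const functions = require('firebase-functions');
const admin = require('firebase-admin');

// Hypothetical implementation: remove the calling user's profile document.
exports.removeProfile = () => functions.https.onCall(async (data, context) => {
    if (!context.auth) {
        throw new functions.https.HttpsError('unauthenticated', 'Sign in first.');
    }
    await admin.firestore().collection('profiles').doc(context.auth.uid).delete();
    return { removed: true };
});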
In the case of Babel/Flow it would look like this:
.
├── /build/ # Compiled output for Node.js 6.x
├── /src/ # Application source files
│ ├── db.js # Cloud SQL client for Postgres
│ ├── index.js # Main export(s)
│ ├── someFuncA.js # Function A
│ ├── someFuncA.test.js # Function A unit tests
│ ├── someFuncB.js # Function B
│ ├── someFuncB.test.js # Function B unit tests
│ └── store.js # Firebase Firestore client
├── .babelrc # Babel configuration
├── firebase.json # Firebase configuration
└── package.json # List of project dependencies and NPM scripts
src/index.js - Main export(s)
export * from './someFuncA.js';
export * from './someFuncB.js';
src/db.js - Cloud SQL client for Postgres
import { Pool } from 'pg';
import { config } from 'firebase-functions';
export default new Pool({
max: 1,
user: '<username>',
database: '<database>',
password: config().db.password,
host: `/cloudsql/${process.env.GCP_PROJECT}:<region>:<instance>`,
});
src/store.js - Firebase Firestore client
import firebase from 'firebase-admin';
import { config } from 'firebase-functions';
firebase.initializeApp(config().firebase);
export default firebase.firestore();
src/someFuncA.js - Function A
import { https } from 'firebase-functions';
import db from './db';
export const someFuncA = https.onRequest(async (req, res) => {
const { rows: regions } = await db.query(`
SELECT * FROM regions WHERE country_code = $1
`, ['US']);
res.send(regions);
});
src/someFuncB.js - Function B
import { https } from 'firebase-functions';
import store from './store';
export const someFuncB = https.onRequest(async (req, res) => {
const { docs: regions } = await store
.collection('regions')
.where('countryCode', '==', 'US')
.get();
res.send(regions);
});
.babelrc
{
"presets": [["env", { "targets": { "node": "6.11" } }]],
}
firebase.json
{
"functions": {
"source": ".",
"ignore": [
"**/node_modules/**"
]
}
}
package.json
{
"name": "functions",
"verson": "0.0.0",
"private": true,
"main": "build/index.js",
"dependencies": {
"firebase-admin": "^5.9.0",
"firebase-functions": "^0.8.1",
"pg": "^7.4.1"
},
"devDependencies": {
"babel-cli": "^6.26.0",
"babel-core": "^6.26.0",
"babel-jest": "^22.2.2",
"babel-preset-env": "^1.6.1",
"jest": "^22.2.2"
},
"scripts": {
"test": "jest --env=node",
"predeploy": "rm -rf ./build && babel --out-dir ./build src",
"deploy": "firebase deploy --only functions"
}
}
$ yarn install # Install project dependencies
$ yarn test # Run unit tests
$ yarn deploy # Deploy to Firebase
To keep it simple (but it does the job), I have personally structured my code like this.
Layout
├── /src/
│ ├── index.ts
│ ├── foo.ts
│ ├── bar.ts
└── package.json
foo.ts
import * as functions from 'firebase-functions';

export const fooFunction = functions.database.ref('/foo').onWrite((change, context) => {
    // do your function.
});

export const someOtherFunction = functions.database.ref('/some/path').onWrite((change, context) => {
    // do the thing.
});
bar.ts
import * as functions from 'firebase-functions';

export const barFunction = functions.database.ref('/bar').onWrite((change, context) => {
    // do your function.
});

export const anotherFunction = functions.database.ref('/another/path').onWrite((change, context) => {
    // do the thing.
});
index.ts
import * as fooFunctions from './foo';
import * as barFunctions from './bar';
module.exports = {
...fooFunctions,
...barFunctions,
};
This works for directories nested to any depth; just follow the same pattern inside the nested directories too.
The Firebase docs have now been updated with a good guide to multi-file code organization:
Docs > Cloud Functions > Write functions > Organize functions
To summarize:
foo.js
const functions = require('firebase-functions');
exports.foo = functions.https.onRequest((request, response) => {
// ...
});
bar.js
const functions = require('firebase-functions');
exports.bar = functions.https.onRequest((request, response) => {
// ...
});
index.js
const foo = require('./foo');
const bar = require('./bar');
exports.foo = foo.foo;
exports.bar = bar.bar;
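Once functions are organized and exported this way, you can still deploy them selectively with the standard Firebase CLI flag, for example:
$ firebase deploy --only functions:foo
$ firebase deploy --only functions:foo,functions:bar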
This format allows your entry point to find additional function files and automatically export each function defined within each file.
Main Entry Point Script
Finds all .js files inside of the functions folder, and exports each function exported from each file.
const fs = require('fs');
const path = require('path');
// Folder where all your individual Cloud Functions files are located.
const FUNCTIONS_FOLDER = './scFunctions';
fs.readdirSync(path.resolve(__dirname, FUNCTIONS_FOLDER)).forEach(file => { // list files in the folder.
if(file.endsWith('.js')) {
const fileBaseName = file.slice(0, -3); // Remove the '.js' extension
const thisFunction = require(`${FUNCTIONS_FOLDER}/${fileBaseName}`);
for(var i in thisFunction) {
exports[i] = thisFunction[i];
}
}
});
Example Export of Multiple Functions from One File
const functions = require('firebase-functions');
const query = functions.https.onRequest((req, res) => {
let query = req.query.q;
res.send({
"You Searched For": query
});
});
const searchTest = functions.https.onRequest((req, res) => {
res.send({
"searchTest": "Hi There!"
});
});
module.exports = {
query,
searchTest
}
The HTTP-accessible endpoints are appropriately named:
✔ functions: query: http://localhost:5001/PROJECT-NAME/us-central1/query
✔ functions: helloWorlds: http://localhost:5001/PROJECT-NAME/us-central1/helloWorlds
✔ functions: searchTest: http://localhost:5001/PROJECT-NAME/us-central1/searchTest
One file
If you only have a few additional files (e.g. just one), you can use:
const your_functions = require('./path_to_your_functions');
for (var i in your_functions) {
exports[i] = your_functions[i];
}
So I have a project which has background functions and HTTP functions, and I also have tests for unit testing. CI/CD will make your life much easier when deploying Cloud Functions.
|-- package.json
|-- cloudbuild.yaml
|-- functions
    |-- index.js
    |-- background
    |   |-- onCreate
    |       |-- index.js
    |       |-- create.js
    |
    |-- http
    |   |-- stripe
    |       |-- index.js
    |       |-- payment.js
    |-- utils
    |   |-- firebaseHelpers.js
    |-- test
    |   |-- ...
    |-- package.json
Note: the utils/ folder is for code shared between functions.
In index.js you can just import all the functions you need and declare them; there is no need to have any logic here, which keeps it cleaner in my opinion.
require('module-alias/register');
const functions = require('firebase-functions');
const onCreate = require('@background/onCreate');
const onDelete = require('@background/onDelete');
const onUpdate = require('@background/onUpdate');
const tours = require('@http/tours');
const stripe = require('@http/stripe');
const docPath = 'tours/{tourId}';
module.exports.onCreate = functions.firestore.document(docPath).onCreate(onCreate);
module.exports.onDelete = functions.firestore.document(docPath).onDelete(onDelete);
module.exports.onUpdate = functions.firestore.document(docPath).onUpdate(onUpdate);
module.exports.tours = functions.https.onRequest(tours);
module.exports.stripe = functions.https.onRequest(stripe);
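The @background and @http prefixes above come from the module-alias package (registered by the first require in index.js). They assume a mapping in functions/package.json roughly like the following; the exact paths are an assumption based on the folder layout above:
"_moduleAliases": {
    "@background": "background",
    "@http": "http"
}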
How about having continuous integration and deployment every time you push your changes to the repo? You can have it by using Google Cloud Build. It's free up to a certain point :) Check this link.
./cloudbuild.yaml
steps:
- name: "gcr.io/cloud-builders/npm"
args: ["run", "install:functions"]
- name: "gcr.io/cloud-builders/npm"
args: ["test"]
- name: "gcr.io/${PROJECT_ID}/firebase"
args:
[
"deploy",
"--only",
"functions",
"-P",
"${PROJECT_ID}",
"--token",
"${_FIREBASE_TOKEN}"
]
substitutions:
_FIREBASE_TOKEN: nothing
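The install:functions and test scripts referenced by the first two build steps live in the root package.json. A hedged guess at what they might look like (the exact commands are assumptions, since they are not shown above):
"scripts": {
    "install:functions": "cd functions && npm install",
    "test": "cd functions && npm test"
}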
There is a pretty good way to organize all of your cloud functions for the long term. I did this recently and it is working flawlessly.
What I did was organize each cloud function into separate folders based on their trigger endpoint. Every cloud function filename ends with *.f.js. For example, if you had onCreate and onUpdate triggers on user/{userId}/document/{documentId}, you would create two files, onCreate.f.js and onUpdate.f.js, in the directory functions/user/document/, and your functions would be named userDocumentOnCreate and userDocumentOnUpdate respectively. (1)
Here is a sample directory structure:
functions/
|----package.json
|----index.js
/----user/
|-------onCreate.f.js
|-------onWrite.f.js
/-------document/
|------------onCreate.f.js
|------------onUpdate.f.js
/----books/
|-------onCreate.f.js
|-------onUpdate.f.js
|-------onDelete.f.js
Here is what a sample trigger file looks like (this one would be functions/user/document/onCreate.f.js):
const functions = require('firebase-functions');
const admin = require('firebase-admin');
const db = admin.database();
const documentsOnCreate = functions.database
.ref('user/{userId}/document/{documentId}')
.onCreate((snap, context) => {
// your code goes here
});
exports = module.exports = documentsOnCreate;
const glob = require("glob");
const camelCase = require('camelcase');
const admin = require('firebase-admin');
const serviceAccount = require('./path/to/ServiceAccountKey.json');
try {
admin.initializeApp({ credential: admin.credential.cert(serviceAccount),
databaseURL: "Your database URL" });
} catch (e) {
console.log(e);
}
const files = glob.sync('./**/*.f.js', { cwd: __dirname });
for (let f = 0, fl = files.length; f < fl; f++) {
const file = files[f];
const functionName = camelCase(file.slice(0, -5).split('/'));
if (!process.env.FUNCTION_NAME || process.env.FUNCTION_NAME === functionName) {
exports[functionName] = require(file);
}
}
(1): You can use any name you want. To me, onCreate.f.js, onUpdate.f.js etc. seem more relevant to the kind of trigger they are.
Here's a simple answer if you're creating Cloud Functions with TypeScript.
/functions
|--index.ts
|--foo.ts
Near all your regular imports at the top, just export all the functions from foo.ts:
export * from './foo';
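And foo.ts itself just exports functions as usual; a minimal assumed example:
// foo.ts (assumed example)
import * as functions from 'firebase-functions';

export const fooFunction = functions.https.onRequest((request, response) => {
    response.send('Hello from foo!');
});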
I am also on the journey of finding the best folder structure for Cloud Functions, so I decided to share what I've come up with:
+ /src
| - index.ts
| + /events
| | - moduleA_events.ts
| | - moduleB_events.ts
| + /service
| | - moduleA_services.ts
| | - moduleB_services.ts
| + /model
| | - objectA.ts
| | - objectB.ts
| | - objectC.ts
/src/index.ts: this file works as the entry point for all events (functions) available in your app, such as database events, HTTPS requests and scheduled functions. However, functions are not declared directly in index.ts, but in the events folder instead. Code sample:
exports.user = require("./events/userEvents")
exports.order = require("./events/orderEvents")
exports.product = require("./events/productEvents")
Note: according to the official GCF documentation, this approach will automatically rename all your functions to the "module-function" pattern. Example: if you have a "userCreated" function inside userEvents.ts, Firebase will rename this function to "user-userCreated".
/src/events: this folder should only contain cloud function declarations and should not handle business logic directly. For the actual business logic, you should call custom functions from your /service folder (which mirrors the same modules as the events folder). Code sample for userEvents.ts:
exports.userCreated = functions.firestore.document("/users/{documentId}").onCreate(async (snapshot) => { userServices.sendWelcomeEmail(); });
/src/service: the actual business logic that connects with other Firebase services such as Firestore, Storage and Auth. You can also import your /model layer here (TypeScript only).
/src/model: the interfaces used in TypeScript to ensure strongly typed functions and objects.
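A minimal sketch of how the /service and /model layers might look; the file names userServices.ts and user.ts, the User interface and the function body are assumptions for illustration (the original only names userServices.sendWelcomeEmail):
// /src/model/user.ts (hypothetical)
export interface User {
    id: string;
    email: string;
    displayName?: string;
}

// /src/service/userServices.ts (hypothetical)
import * as admin from "firebase-admin";
import { User } from "../model/user";

// Illustrative business logic: queue a welcome-email document that a mail
// worker could pick up. Assumes admin.initializeApp() was already called.
export async function sendWelcomeEmail(user: User): Promise<void> {
    await admin.firestore().collection("mail").add({
        to: user.email,
        template: "welcome",
    });
}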
As you may notice, this approach is mainly based on MVC and OOP principles. There are a lot of good debates on whether we should go with functional programming in a serverless environment instead. Since my backend background is Java and C#, the folder structure I presented here feels more natural to me; however, I would be very interested to know how different this folder structure would be with a functional programming approach.