And what about:
var myModule = require.main.require('./path/to/module');
It requires the file as if it were required from the main js file, so it works pretty well as long as your main js file is at the root of your project... and that's something I appreciate.
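For example, a minimal sketch (the ./lib/logger path and the info() method are made up for illustration); assuming the entry point is /project/index.js, a file several levels deep can still load /project/lib/logger.js:
// deep/inside/some/feature.js
var logger = require.main.require('./lib/logger');
logger.info('resolved relative to the main module, not relative to this file');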
There's a really interesting section in the Browserify Handbook:
avoiding ../../../../../../..
Not everything in an application properly belongs on the public npm and the overhead of setting up a private npm or git repo is still rather large in many cases. Here are some approaches for avoiding the ../../../../../../../ relative paths problem.
node_modules
People sometimes object to putting application-specific modules into node_modules because it is not obvious how to check in your internal modules without also checking in third-party modules from npm.
The answer is quite simple! If you have a .gitignore file that ignores node_modules:
node_modules
You can just add an exception with ! for each of your internal application modules:
node_modules/*
!node_modules/foo
!node_modules/bar
Please note that you can't unignore a subdirectory if the parent is already ignored. So instead of ignoring node_modules, you have to ignore every directory inside node_modules with the node_modules/* trick, and then you can add your exceptions.
Now anywhere in your application you will be able to require('foo') or require('bar') without having a very large and fragile relative path.
If you have a lot of modules and want to keep them more separate from the third-party modules installed by npm, you can just put them all under a directory in node_modules such as node_modules/app:
node_modules/app/foo
node_modules/app/bar
Now you will be able to require('app/foo') or require('app/bar') from anywhere in your application.
In your .gitignore, just add an exception for node_modules/app:
node_modules/*
!node_modules/app
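To make the layout concrete, here is a minimal sketch (the foo name and its export are hypothetical):
node_modules/app/foo/index.js
module.exports = function foo() {
  return 'hello from app/foo';
};
src/some/deeply/nested/file.js
var foo = require('app/foo'); // no ../../.. needed, however deep this file is
console.log(foo());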
If your application had transforms configured in package.json, you'll need to create a separate package.json with its own transform field in your node_modules/foo or node_modules/app/foo component directory because transforms don't apply across module boundaries. This will make your modules more robust against configuration changes in your application and it will be easier to independently reuse the packages outside of your application.
symlink
Another handy trick if you are working on an application where you can make symlinks and don't need to support windows is to symlink a lib/ or app/ folder into node_modules. From the project root, do:
ln -s ../lib node_modules/app
and now from anywhere in your project you'll be able to require files in lib/ by doing require('app/foo.js') to get lib/foo.js.
custom paths
You might see some places talk about using the $NODE_PATH environment variable or opts.paths to add directories for node and browserify to look in to find modules.
Unlike most other platforms, using a shell-style array of path directories with $NODE_PATH is not as favorable in node compared to making effective use of the node_modules directory. This is because your application is more tightly coupled to a runtime environment configuration, so there are more moving parts and your application will only work when your environment is set up correctly.
node and browserify both support but discourage the use of $NODE_PATH.
I like to make a new node_modules folder for shared code, then let node and require do what it does best.
for example:
- node_modules // => these are loaded from your package.json
- app
  - node_modules // => add node-style modules
    - helper.js
  - models
    - user
    - car
- package.json
- .gitignore
For example, if you're in car/index.js you can require('helper') and node will find it!
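A minimal sketch of how that resolution plays out (helper's export is just an illustration):
app/node_modules/helper.js
module.exports.formatName = function (user) {
  return user.first + ' ' + user.last;
};
app/models/car/index.js
// node walks up from app/models/car/ and finds app/node_modules/helper.js
var helper = require('helper');
module.exports.describe = function (owner) {
  return helper.formatName(owner) + "'s car";
};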
node has a clever algorithm for resolving modules that is unique among rival platforms.
If you require('./foo.js') from /beep/boop/bar.js, node will look for ./foo.js in /beep/boop/foo.js. Paths that start with a ./ or ../ are always local to the file that calls require().
If however you require a non-relative name such as require('xyz') from /beep/boop/foo.js, node searches these paths in order, stopping at the first match and raising an error if nothing is found:
/beep/boop/node_modules/xyz
/beep/node_modules/xyz
/node_modules/xyz
For each xyz directory that exists, node will first look for a xyz/package.json to see if a "main" field exists. The "main" field defines which file should take charge if you require() the directory path.
For example, if /beep/node_modules/xyz is the first match and /beep/node_modules/xyz/package.json has:
{
  "name": "xyz",
  "version": "1.2.3",
  "main": "lib/abc.js"
}
then the exports from /beep/node_modules/xyz/lib/abc.js will be returned by require('xyz').
If there is no package.json or no "main" field, index.js is assumed:
/beep/node_modules/xyz/index.js
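You can watch this resolution happen with require.resolve(), which returns the path of the file that would be loaded without actually running it (paths here follow the example above):
// from /beep/boop/bar.js
console.log(require.resolve('xyz'));
// => /beep/node_modules/xyz/lib/abc.js, given the package.json shown above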
It seems "really bad" but give it time. It is, in fact, really good. The explicit require()
s give a total transparency and ease of understanding that is like a breath of fresh air during a project life cycle.
Think of it this way: You are reading an example, dipping your toes into Node.js and you've decided it is "really bad IMO." You are second-guessing leaders of the Node.js community, people who have logged more hours writing and maintaining Node.js applications than anyone. What is the chance the author made such a rookie mistake? (And I agree, from my Ruby and Python background, it seems at first like a disaster.)
There is a lot of hype and counter-hype surrounding Node.js. But when the dust settles, we will acknowledge that explicit modules and "local first" packages were a major driver of adoption.
Of course, node_modules is searched starting from the current directory, then the parent, then the grandparent, the great-grandparent, and so on. So packages you have installed already work this way. Usually you can require("express") from anywhere in your project and it works fine.
If you find yourself loading common files from the root of your project (perhaps because they are common utility functions), then that is a big clue that it's time to make a package. Packages are very simple: move your files into node_modules/ and put a package.json there. Voila! Everything in that namespace is accessible from your entire project. Packages are the correct way to get your code into a global namespace.
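A minimal sketch of such a package (the utils name and the slugify function are made up):
node_modules/utils/package.json
{
  "name": "utils",
  "version": "1.0.0",
  "main": "index.js"
}
node_modules/utils/index.js
module.exports.slugify = function (s) {
  return s.toLowerCase().replace(/\s+/g, '-');
};
anywhere/in/the/project.js
var utils = require('utils');
utils.slugify('My Page Title'); // => 'my-page-title'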
I personally don't use these techniques, but they do answer your question, and of course you know your own situation better than I.
You can set $NODE_PATH to your project root. That directory will be searched when you require().
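A quick sketch, assuming the entry point is app.js and the shared code lives in lib/ (both names hypothetical); note that a relative NODE_PATH is resolved against the directory you launch node from:
NODE_PATH=. node app.js
// then, from any file in the project:
var foo = require('lib/foo'); // resolves to ./lib/foo.js under the project root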
Next, you could compromise and require a common, local file from all your examples. That common file simply re-exports the true file in the grandparent directory.
examples/downloads/app.js (and many others like it)
var express = require('./express')
examples/downloads/express.js
module.exports = require('../../')
Now when you relocate those files, the worst-case is fixing the one shim module.
Have a look at node-rfr.
It's as simple as this:
var rfr = require('rfr');
var myModule = rfr('projectSubDir/myModule');
If you are using yarn instead of npm you can use workspaces.
Let's say I have a folder services that I wish to require more easily:
.
├── app.js
├── node_modules
├── test
├── services
│   ├── foo
│   └── bar
└── package.json
To create a Yarn workspace, create a package.json file inside the services folder:
{
"name": "myservices",
"version": "1.0.0"
}
In your main package.json add:
"private": true,
"workspaces": ["myservices"]
Run yarn install from the root of the project.
Then, anywhere in your code, you can do:
const { myFunc } = require('myservices/foo')
instead of something like:
const { myFunc } = require('../../../../../../services/foo')
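For that to work, services/foo just needs to export myFunc, e.g. (the implementation is made up):
services/foo.js (or services/foo/index.js)
module.exports.myFunc = function () {
  return 'hello from services/foo';
};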
I use process.cwd() in my projects. For example:
var Foo = require(process.cwd() + '/common/foo.js');
It might be worth noting that this will result in requiring an absolute path, though I have yet to run into issues with this.