Node.js Alexa Task Issue
I'm currently coding a Node.js Alexa task via AWS Lambda, and I have been trying to write a function that receives information from the OpenWeather API and parses it into a variable called weather. The relevant code is as follows:
var request = require('request');
var weather = "";

function isBadWeather(location) {
    var endpoint = "http://api.openweathermap.org/data/2.5/weather?q=" + location + "&APPID=205283d9c9211b776d3580d5de5d6338";
    var body = "";
    request(endpoint, function (error, response, body) {
        if (!error && response.statusCode == 200) {
            body = JSON.parse(body);
            weather = body.weather[0].id;  // store the OpenWeather condition id
        }
    });
}

function testWeather() {
    // wait briefly for the request above, then map the condition id to a boolean
    setTimeout(function() {
        if (weather >= 200 && weather < 800)
            weather = true;
        else
            weather = false;
        console.log(weather);
        generateResponse(buildSpeechletResponse(weather, true), {});
    }, 500);
}
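As a side note, separate from the deployment error described next: the fixed 500 ms setTimeout in testWeather races the HTTP request, so weather can still be empty if the API responds slowly. A minimal callback-based sketch (purely illustrative; it reuses the names from the snippet above and substitutes a placeholder API key) would look like this:

var request = require('request');

// Fetch the current conditions for a location and report, via a callback,
// whether the OpenWeather condition id indicates bad weather.
function isBadWeather(location, done) {
    var endpoint = "http://api.openweathermap.org/data/2.5/weather?q=" + location + "&APPID=YOUR_API_KEY";
    request(endpoint, function (error, response, body) {
        if (error || response.statusCode !== 200) {
            return done(error || new Error("HTTP " + response.statusCode));
        }
        var id = JSON.parse(body).weather[0].id;
        // Condition ids from 200 up to (but not including) 800 cover storms, rain, snow, etc.
        done(null, id >= 200 && id < 800);
    });
}

// Usage: the Alexa response is only built once the request has completed.
isBadWeather("Boston", function (err, isBad) {
    if (err) return console.error(err);
    console.log(isBad);
    // generateResponse(buildSpeechletResponse(isBad, true), {});
});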
I ran this snippet countless times in Cloud9 and other IDEs, and it seemed to work flawlessly. However, when I zip it into a package and upload it to AWS Lambda, I get the following error:
{ "errorMessage": "Cannot find module '/var/task/index'", "errorType": "Error", "stackTrace": [ "Function.Module._load (module.js:276:25)", "Module.require (module.js:353:17)", "require (internal/module.js:12:17)" ] }
I installed module-js, request, and many other Node modules that should make this code run, but nothing seems to fix this issue. Here is my directory structure, just in case:
- planyr.zip
  - index.js
  - node_modules
  - package.json
Does anyone know what the issue could be?
Fixed it! My issue was that I tried to zip the file using my Mac's built-in compression function in Finder.
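For context on the error message: with the default Node.js handler setting of index.handler, Lambda requires a file named index.js at the root of the deployment package, i.e. /var/task/index.js. Compressing the project folder in Finder nests everything one directory deeper inside the archive, so that require fails. A minimal sketch of what Lambda expects to find at the archive root (assuming the default callback-style handler):

// index.js: must sit at the top level of the zip, not inside a subfolder
exports.handler = function (event, context, callback) {
    // Alexa skill logic goes here
    callback(null, { status: "ok" });
};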
If you're a Mac user like me, run the following command in Terminal from the root directory of your project (the folder containing your index.js, node_modules, etc.):
zip -r ../yourfilename.zip *
For Windows (using PowerShell):
Compress-Archive -LiteralPath node_modules, index.js -DestinationPath yourfilename.zip
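Either way, it's worth checking the archive before uploading: listing its contents (for example with unzip -l yourfilename.zip on macOS or Linux) should show index.js and node_modules at the top level rather than nested inside a project folder.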