
Feeding Logstash from an Azure web app. How?

I have a web app hosted on the Azure platform and an ELK stack hosted on a virtual machine, also in Azure (same subscription), and I am struggling to find a way to ship the logs from the app to Logstash.

A web app stores all its files on storage that is only accessible via FTP, which Logstash does not have an input plugin for.

What do people use to ship logs to ELK from web apps? If it were running on a VM I would use NXLog, but that's not possible for a web app.

I also use Log4Net and tried a UDP forwarder, which worked on my local ELK stack but not the Azure-hosted one, despite my adding the public UDP endpoint.
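For reference, a UDP forwarding setup with log4net typically looks like the following. The remote address, port, and layout here are placeholders, not the asker's actual configuration:

```xml
<!-- log4net.config: forward log events over UDP (address/port are examples) -->
<appender name="UdpAppender" type="log4net.Appender.UdpAppender">
  <remoteAddress value="203.0.113.10" />   <!-- public IP of the ELK VM (placeholder) -->
  <remotePort value="5960" />              <!-- must match the Logstash udp input port -->
  <layout type="log4net.Layout.XmlLayoutSchemaLog4j">
    <locationInfo value="true" />
  </layout>
</appender>
<root>
  <level value="INFO" />
  <appender-ref ref="UdpAppender" />
</root>
```

On the Logstash side this pairs with a `udp { port => 5960 }` input. Note that for this to work against an Azure VM, the UDP port must be open both in the VM's endpoint/NSG rules and in the guest OS firewall, which is a common reason the local stack works while the hosted one does not.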

asked Nov 17 '15 by Sheff



2 Answers

Currently I am using Serilog to push my application log messages (in batches) to a Redis queue, which is in turn read by Logstash to enrich them and push them into Elasticsearch. This gives a reliable distributed setup that does not lose any application logs unless the Redis max queue length is exceeded. Added bonus: Serilog emits JSON, so your Logstash config can stay pretty simple. Example code can be found here: https://gist.github.com/crunchie84/bcd6f7a8168b345a53ff
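The Logstash side of such a setup can be sketched as below. The Redis host, list key, and Elasticsearch address are assumptions for illustration, not values from the linked gist:

```conf
# logstash pipeline: drain a Redis list and index into Elasticsearch
input {
  redis {
    host      => "redis.internal.example"  # placeholder: your Redis host
    data_type => "list"                    # BLPOP from a list
    key       => "logstash"                # placeholder: list the app pushes to
    codec     => "json"                    # Serilog already emits JSON
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]            # placeholder: your Elasticsearch node
  }
}
```

Because the application only needs to reach Redis (not Logstash directly), the queue also acts as a buffer when Logstash or Elasticsearch is briefly unavailable.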

answered Nov 09 '22 by Mark van Straten


Azure now has a project on GitHub called azure-diagnostics-tools that contains Logstash plugins for reading from blob and table storage, among other things.

Presumably, you can just enable diagnostics logging to blob/table storage for your web app and use the Logstash plugin to get the logs into Elasticsearch.
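As a rough sketch, the blob-storage input from that project is configured along these lines. The account name, key, and container are placeholders, and setting names may vary between plugin versions, so check the plugin's README:

```conf
# logstash pipeline: read Web App diagnostics logs from Azure Blob storage
input {
  azureblob {
    storage_account_name => "mystorageaccount"   # placeholder
    storage_access_key   => "<access-key>"       # placeholder
    container            => "webapp-logs"        # container diagnostics writes to
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]                  # placeholder
  }
}
```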

Regarding how to get log4net output into blob storage: you can use log4net.Appender.TraceAppender to write to the Trace system, which causes the output to be collected into blob/table storage when that option is enabled for the web app.
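A minimal TraceAppender configuration looks like this (the pattern layout is just an example):

```xml
<!-- log4net.config: route log4net events into System.Diagnostics.Trace,
     which Azure App Service diagnostics can capture to blob/table storage -->
<appender name="TraceAppender" type="log4net.Appender.TraceAppender">
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date [%thread] %-5level %logger - %message%newline" />
  </layout>
</appender>
<root>
  <level value="INFO" />
  <appender-ref ref="TraceAppender" />
</root>
```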

There's also an alternative to FTP for local files: you can use the Kudu REST VFS API to access (and modify) files over HTTP. I found this to be significantly faster and more reliable than FTP.
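For example, the VFS API exposes files under `/api/vfs/`, so log files can be listed and fetched with plain HTTP requests. The app name, credentials, and file path below are placeholders:

```
# List files under LogFiles via the Kudu VFS API
curl -u '$myapp:<deployment-password>' \
  "https://<app>.scm.azurewebsites.net/api/vfs/LogFiles/"

# Download a specific log file (path is a placeholder)
curl -u '$myapp:<deployment-password>' \
  "https://<app>.scm.azurewebsites.net/api/vfs/LogFiles/application.log" \
  -o application.log
```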

answered Nov 09 '22 by makhdumi