What is the right way to log every published message and save it to my server's database?
There are two options I can think of:
What is the best practice with respect to performance and cost?
We wrote an article that covers the right way to log JSON messages to a private database.
While many approaches exist, one stands out: using PubNub Functions with an After Publish event, you can asynchronously and reliably save each message to your database. Your database needs to be accessible via a secured HTTPS endpoint.
At the time of writing, PubNub does not index your messages for full-text search. If you need Full Text Search, index the messages in your own database or use an API provider like https://www.algolia.com/.
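For intuition about what a full-text index does, here is a toy inverted index in plain JavaScript: each word maps to the set of message ids containing it. This is purely illustrative; in practice you would use your database's FTS engine (e.g. PostgreSQL's tsvector) or a hosted service like Algolia.

```javascript
// Toy inverted index: word -> set of message ids containing that word.
const index = new Map();

// Tokenize a message and record each word against the message id.
function indexMessage(id, text) {
  for (const word of text.toLowerCase().split(/\W+/).filter(Boolean)) {
    if (!index.has(word)) index.set(word, new Set());
    index.get(word).add(id);
  }
}

// Return the ids of all messages containing the given word.
function search(word) {
  return [...(index.get(word.toLowerCase()) || [])];
}
```

Real FTS engines add stemming, ranking, and phrase queries on top of this basic structure, which is why delegating to the database or an API provider is usually the better choice.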
Data is valuable. AI and ML let you create insight from your data using TensorFlow, and you may want to run analysis on message content using EMR/Hadoop or other big-data tooling.
The following steps use PubNub Functions to save each message asynchronously into your database.
Getting started is easy. Assuming you already have a stream of messages being published to a PubNub channel, follow these steps to create a realtime function that is triggered on every publish event.
// Request Handler
export default request => {
    return save(request).then(() => request.ok());
};

// Async Logging/Save of JSON Messages
function save(request, retry = 3) {
    const xhr = require('xhr');
    const post = { method: "POST", body: JSON.stringify(request.message) };
    const url = "https://my.company.com/save"; // <-- CHANGE URL HERE

    // save the message asynchronously
    return xhr.fetch(url, post).then(serverResponse => {
        // Save Success!
    }).catch(err => {
        // Retry on failure, up to `retry` more times
        if (retry > 0) return save(request, retry - 1);
    });
}