We've built an MS Bot Framework bot that consumes our existing, internal, on-premises APIs during conversations. We'd like to release this bot by dropping a Web Chat Component into the DOM of our existing, internally-facing, on-premises application.
Given our existing architecture, we naturally want to host this bot internally too--to leverage all of our existing configuration and deployment processes. We understand that, regardless, the bot will have to communicate with LUIS--which is fine by us; calling out to LUIS doesn't require the more complex (larger attack surface, less central-IT buy-in) setup of Azure connecting directly into our internal business data API.
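For context, the only cloud dependency in this design is the bot's outbound call to LUIS. A minimal sketch of what that looks like with the Node SDK, assuming the botbuilder-ai package (the app ID, key, and endpoint below are placeholders):

    // Minimal sketch (assumes the botbuilder-ai npm package; the LUIS app ID,
    // key, and endpoint are placeholders). The bot only makes an outbound
    // HTTPS call to LUIS -- nothing in Azure reaches back into internal APIs.
    const { LuisRecognizer } = require('botbuilder-ai');

    const recognizer = new LuisRecognizer({
        applicationId: process.env.LUIS_APP_ID,
        endpointKey: process.env.LUIS_ENDPOINT_KEY,
        endpoint: 'https://westus.api.cognitive.microsoft.com'
    });

    // Inside a turn handler: resolve the user's intent, then query our
    // on-premises business data API locally based on that intent.
    async function routeTurn(context) {
        const result = await recognizer.recognize(context);
        const topIntent = LuisRecognizer.topIntent(result);
        // ...look up the sensitive numbers from the internal API here...
        return topIntent;
    }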
I think this diagram makes it more clear:
EDIT 1: Can we also host Direct Line, or a similar connector, on-premises without having to write a custom connector? And can we chat with our bot over such a connector without having to write a custom chat component/widget for the DOM? (The Web Chat component would work just fine as long as it can be pointed at our channel.)
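For what it's worth, the stock Web Chat control can be pointed at a non-default Direct Line endpoint via its domain option, so an on-premises connector that speaks the Direct Line protocol would not require a custom widget. A rough sketch, assuming the botframework-webchat bundle is already loaded on the page and that the internal endpoint URL below is hypothetical:

    // Rough sketch (assumes the botframework-webchat CDN bundle is loaded;
    // https://chat.internal.example.com is a hypothetical on-premises endpoint
    // implementing the Direct Line REST protocol, and the token is a placeholder).
    window.WebChat.renderWebChat(
        {
            directLine: window.WebChat.createDirectLine({
                domain: 'https://chat.internal.example.com/v3/directline',
                token: 'PLACEHOLDER_TOKEN'
            })
        },
        document.getElementById('webchat') // existing element in our app's DOM
    );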
The end goal here is to keep all of our chat traffic on-premises, because this is a data-driven chatbot serving sensitive numbers. It would take less time to redevelop this in another framework that can run wholly on-premises than to get approval from our central IT.
Side Note: I'm aware of the Azure Stack Preview. The minimum hardware requirements (and probably subscription costs too) are extreme overkill. (We're talking about a single Node app, after all.)
This is not a duplicate of this question, because this one also addresses the key element of hosting the Direct Line connector on-premises, whereas the other question assumed the connector would still run on Azure.
You can host this on AWS, GCP, Heroku, etc. You can even host it as a function, using AWS Lambda or Azure Functions. In most of the Microsoft samples, the bot is hosted on Azure's PaaS offering, App Service.
You'll need to create and configure four different resources for the bot in Azure: an Azure Bot, a Key Vault, a Web App, and an App Service Plan.
The Bot Framework SDK v4.14 is an open-source SDK that enables developers to model and build sophisticated conversations using C#, Java, JavaScript, or Python.
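Since the question describes a single Node app, here is a minimal sketch of the bot's message endpoint with the JavaScript SDK. It assumes the botbuilder and restify packages; the credentials and port are placeholders, and the same process can run on any of the platforms above, or on an internal server, as long as the channel can reach /api/messages:

    // Minimal sketch (assumes the botbuilder and restify npm packages;
    // the app ID/password and port 3978 are placeholders).
    const restify = require('restify');
    const { BotFrameworkAdapter, ActivityHandler } = require('botbuilder');

    // The adapter authenticates traffic coming from the connector/channel.
    const adapter = new BotFrameworkAdapter({
        appId: process.env.MicrosoftAppId,
        appPassword: process.env.MicrosoftAppPassword
    });

    // A trivial bot: echo the user's message back.
    class EchoBot extends ActivityHandler {
        constructor() {
            super();
            this.onMessage(async (context, next) => {
                await context.sendActivity(`You said: ${context.activity.text}`);
                await next();
            });
        }
    }
    const bot = new EchoBot();

    // Plain HTTP server -- this part has no dependency on where it is hosted.
    const server = restify.createServer();
    server.listen(process.env.PORT || 3978, () => {
        console.log(`Bot endpoint listening at ${server.url}/api/messages`);
    });

    server.post('/api/messages', (req, res) => {
        adapter.processActivity(req, res, async (context) => bot.run(context));
    });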
First of all, any chatbot is just the program that runs alongside the NLP; it's the NLP that brings the knowledge to the chatbot, and the NLP in turn rests on machine-learning techniques.
There are a few reasons why on-premises chatbots are less common.
But cloud-based NLP may not provide the data privacy and security you need, and the flexibility to include your own business logic is very limited. Altogether, whether to go on-premises or to the cloud depends on the needs and use case of your requirements. However, please refer to the links below for end-to-end knowledge on building a fully customisable chatbot on-premises, in a few simple steps, using open-source frameworks and tools (Botkit, RASA, etc.).
The series also explains how to host the Bot Framework on-premises.
Complete On-Premise and Fully Customisable Chat Bot - Part 1 - Overview (https://creospiders.blogspot.com/2018/03/complete-on-premise-and-fully.html)
Complete On-Premise and Fully Customisable Chat Bot - Part 2 - Agent Building Using Botkit (https://creospiders.blogspot.com/2018/03/complete-on-premise-and-fully_16.html)
Complete On-Premise and Fully Customisable Chat Bot - Part 3 - Communicating to the Agent that has been built (https://creospiders.blogspot.com/2018/04/CompleteOn-PremiseandFullyCustomisableChatBotpart3.html)
Complete On-Premise and Fully Customisable Chat Bot - Part 4 - Integrating the Natural Language Processor NLP (https://creospiders.blogspot.com/2018/07/complete-on-premise-and-fully.html)
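To give a flavour of the Botkit approach covered in Part 2, here is a rough sketch of an agent that runs entirely on an internal server. It assumes the botkit npm package (v4.x, which may differ from the version used in the series); the trigger word and webhook path are placeholders:

    // Rough sketch (assumes the botkit npm package, v4.x; the trigger word and
    // webhook path are placeholders, and the blog series may use an older API).
    const { Botkit } = require('botkit');

    // With no channel adapter configured, Botkit exposes a plain webhook that
    // an internal chat front end can post to -- nothing leaves the network.
    const controller = new Botkit({
        webhook_uri: '/api/messages'
    });

    // Reply to any message containing "revenue" by calling the internal API.
    controller.hears('revenue', 'message', async (bot, message) => {
        // ...fetch the sensitive numbers from the on-premises API here...
        await bot.reply(message, 'Here are the figures you asked for.');
    });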