One day, our lead Cloud Architect brought an Alexa-enabled Amazon Echo to the office. Nothing weird about that: he's known for being the #1 fan of new gadgets and all things "technology". Honoring our ancient office tradition, we gathered around to see his latest purchase. After asking the Echo a multitude of silly questions to see what it would say, one of our engineers decided to look at the API to see what's inside. After some basic exploration, we discovered it is possible to add custom Skills (Amazon's extension mechanism for Echo) with almost unlimited possibilities. Unlike its competitors, Amazon Echo is an "open book" with tons of blank pages to write on. The challenge was set right away: how could we use this flexible technology to extend the tools we use in the workplace every day?
They say innovation happens when two ideas are put together and give birth to a brand new one.
After a recent Facebook Messenger update that enabled developers and businesses to build bots, people realized the human-machine interface would never be the same. Even though this technology is often associated with machine learning and a pinch of artificial intelligence, in most cases it's a simple API call with 10 predefined answers and 3 conditions. To put it simply, a pizza-ordering chatbot doesn't call the pizza delivery itself – it just triggers an API, much like a mobile app would, and your order goes into the restaurant's ordering system. Today, many teams are using HipChat not only to keep up to date on project status, get notified when a website is down, and redeploy a previous version, but also for setting up ChatOps.
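To make the "predefined answers plus a few conditions" point concrete, here is a minimal sketch of such a bot. All names (`CANNED_REPLIES`, `place_order`, the keywords) are hypothetical illustrations, not the code behind any real pizza bot:

```python
# A rule-based chatbot in miniature: a handful of canned replies,
# a couple of conditions, and one simulated API call.

CANNED_REPLIES = {
    "hours": "We are open 11:00-22:00 every day.",
    "menu": "Today's menu: Margherita, Pepperoni, Quattro Formaggi.",
}

def place_order(item: str) -> str:
    # A real bot would POST to the pizzeria's ordering API here;
    # this sketch just simulates that call.
    return f"Order placed: {item}"

def handle_message(text: str) -> str:
    text = text.lower()
    if "order" in text:
        # Condition 1: an order request triggers the "API call".
        item = text.replace("order", "").strip() or "Margherita"
        return place_order(item)
    for keyword, reply in CANNED_REPLIES.items():
        # Condition 2: a keyword match returns a predefined answer.
        if keyword in text:
            return reply
    # Condition 3: fallback when nothing matches.
    return "Sorry, I didn't get that. Try 'menu' or 'order <pizza>'."
```

No machine learning anywhere in sight: the bot's apparent intelligence is just keyword matching over a lookup table.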
The smart-home device Amazon Echo was the first voice-control device to ship with an open API for adding custom integrations that can be shared with others.
Doesn't a chatbot with a voice interface sound fun? We thought the same. HipChat has long fostered integrations within the developer community, and Echo and HipChat fit together like two puzzle pieces, so after a day of work we had the first version of a talking bot.
From the very beginning, we decided to connect our custom bots to Echo and test how they worked. Development was completed in 3 days. At their core, all chatbots are really simple: a standalone application connected to the chat on one end and to API services on the other. We added another integration point to our application – a controller similar to the chat one. Over the course of one week, Jenkins counted more than 1,000 builds in the demo environment, while we counted the people who joined us to test this exciting feature and wanted the same for their own set of bots.
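The architecture above can be sketched in a few lines: one bot core talking to backend services, with two thin front ends – the existing chat connection and the Echo-facing controller we added beside it. Class names and the sample command are our own illustration, not the actual POC code:

```python
# Sketch of the POC shape: one core, two interchangeable front ends.

class BotCore:
    """Talks to backend API services and produces replies."""
    def process(self, command: str) -> str:
        if command == "build status":
            # Real code would query e.g. the Jenkins API here.
            return "Last build: #1042, passed."
        return "Unknown command."

class ChatConnector:
    """Front end #1: relays chat-room messages to the core."""
    def __init__(self, core): self.core = core
    def on_message(self, text): return self.core.process(text)

class EchoController:
    """Front end #2: relays voice commands; same core underneath."""
    def __init__(self, core): self.core = core
    def on_voice_command(self, text): return self.core.process(text)

core = BotCore()
chat = ChatConnector(core)
echo = EchoController(core)
```

Because both front ends share the core, any command that works in chat works by voice for free – which is exactly why adding the Echo controller took days, not weeks.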
Initial design of POC
So why did we need a chat at all instead of connecting all services directly to Echo? More and more teams use chats to coordinate their teamwork, applying the ChatOps methodology to group all the important notifications in one place. Integrating all these options directly into Echo would be a time-consuming and non-scalable solution; at the same time, platforms like HipChat already have integrations that can match the preferences of any team.
Sounds like a perfect plan, but we faced a couple of restrictions:
- Bots cannot trigger other integrations themselves
- It’s impossible to add custom voice commands to Echo without certification
Even though we can't make bots trigger other integrations, we can read their statuses and come up with universal commands that give users only the important information, eliminating chat noise (laughter, small talk and other non-essential messages). That's why we decided to build a proper voice-over tool that reads only important messages from users and other integrations. This lets not only Jenkins users but almost any user with Echo and HipChat customize the app by ticking boxes in the configuration tab of the HipChat marketplace extension.
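The filtering idea is straightforward: keep only the messages posted by integrations the user ticked, and drop everything else. A minimal sketch, with hypothetical sender names and message shape:

```python
# Noise filter: read out only messages from selected integrations.

SELECTED_INTEGRATIONS = {"Jenkins", "JIRA"}

def notification_feed(messages):
    """Return only messages posted by selected integrations,
    skipping human small talk and unselected bots."""
    return [
        f"{m['sender']}: {m['text']}"
        for m in messages
        if m.get("sender") in SELECTED_INTEGRATIONS
    ]

feed = notification_feed([
    {"sender": "Jenkins", "text": "Build #1042 failed"},
    {"sender": "alice", "text": "lol good morning"},
    {"sender": "JIRA", "text": "New ticket PROJ-17 created"},
])
```

Here `alice`'s small talk is dropped and only the two integration messages survive – the "important information" the voice feed would read aloud.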
To show you how Alexa works, here's a scenario: imagine you are a development manager who works with a remote, geographically dispersed team. You set up a chat room where you run ChatOps with all the important integrations like Bamboo, Stand-Up Bot, Bitbucket/Stash, JIRA and a lot of other cool features needed for a proper workflow. Before you get down to work the next day, you check what was done overnight by different team members. So instead of opening a laptop or phone and reading 10 pages of chat, you just ask Echo: "Alexa, ask HipChat for notification feed" and it will read out all messages from the integrations you selected in the settings: broken builds, new tickets in the project or commits to the repository – whatever you want to hear. Then you can ask Echo for the user feed and it will read messages from tech leads or other peer managers. So ultimately, you get a daily briefing in 30 seconds. We don't claim this will completely substitute your daily status routine, but at least you can tell whether there are any urgent issues that need your attention right away. Plus, you may still enjoy your morning coffee with no rush.
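Under the hood, a custom Alexa Skill receives a JSON request naming the matched intent and replies with a JSON envelope containing the text to speak. The sketch below shows that round trip; the intent name `NotificationFeedIntent` and the way the feed is supplied are our assumptions, not VoiceMyBot's actual code:

```python
# Hypothetical endpoint for "Alexa, ask HipChat for notification feed".

def speak(text):
    # Alexa Skills Kit response envelope with PlainText output speech.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": True,
        },
    }

def handle_request(event, feed):
    # 'event' mirrors the JSON Alexa POSTs to the skill endpoint;
    # 'feed' is the pre-filtered list of integration messages.
    intent = event["request"]["intent"]["name"]
    if intent == "NotificationFeedIntent":
        return speak(" ".join(feed) or "No new notifications.")
    return speak("Sorry, I can't help with that yet.")
```

Echo handles the hard part (speech recognition and synthesis); the skill itself is just JSON in, JSON out.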
Analyzing market needs and user demands led us to add features that make daily life easier. For example, we added the ability to send quick-reply messages like "Running late" or "Be there in 10 minutes". Our team is excited to share this extension and its code as an open-source project on the Atlassian Marketplace.
Since this idea originated from our shared passion for technology, the number of its potential users is limited to people who use both HipChat and Echo.
However, so as not to keep this pearl to ourselves, our team will be heading to Atlassian Summit – a hub of active Atlassian product users – to present the "VoiceMyBot" initiative. We look forward to getting essential feedback, setting a strategy for future releases, and making this project more impactful for the community as a whole.
Want to see VoiceMyBot in action? See our case study for more.