My Experience at Hack The North 2020++

Abe Arafat
3 min read · Jan 24, 2021

Last weekend at Hack The North, I created wellness.ai, a supportive pocket companion.

What is Hack The North anyways?

Hack The North is Canada’s largest hackathon. It is a fully student-run operation, supported by Waterloo Engineering. This year’s event was online, and over 3000 high school and university students participated. Teams of up to 4 members had 36 hours to create an innovative product, often while using a technology or API provided by the sponsors.

What sets Hack The North apart from so many other hackathons (besides its massive scale) is the infrastructure they set up to help us hackers design our products. Leading up to the weekend, they hosted dozens of technical workshops and panels with field experts and recent graduates, all of which were recorded and are available on their YouTube page. During the hackathon, mentors were available to help teams work out bugs or get past big obstacles. This is a real game changer, since a single problem can stall an implementation for hours and waste precious time.

The whole experience culminates in a presentation to a panel of judges, where teams present their design process and demo their product.

What is wellness.ai?

Wellness.ai is a voice-activated assistant designed to help you maintain your mental health during these especially tough times. You can talk to wellness.ai using any Google Assistant or Alexa-enabled device. Whether it's just one sentence or an entire soliloquy, wellness.ai interprets the emotion in your message using artificial intelligence. If the voice app detects that you have been feeling negative emotions for many consecutive days, it will suggest an activity proven to improve your mood, such as going for a walk or calling a friend.

How did we do it?

To build wellness.ai, my teammate and I primarily used 2 technologies:

  1. IBM Watson: We used IBM Watson's Tone Analyzer API to determine the emotion in the user's spoken journal entry. The service runs on IBM Cloud and uses a neural network trained to recognize tone in text.
  2. VoiceFlow: VoiceFlow is a tool that makes designing voice-enabled applications super easy. The key features we used were the API block, the logic tools, custom JavaScript (JS), and the integrated Google Sheets API. We used the API block to send IBM Watson the message the user speaks and retrieve a response. We wrote some JS to a) format the user's speech so that the Watson API could easily understand it, and b) parse the JSON returned and extract the relevant data. The integrated Google Sheets API gave us a database where we stored the relevant data from each API call. The logic tools checked whether the last three entries were negative emotions, and if so, the app randomly suggested a mood-boosting activity (there's a rough sketch of this flow right after this list).
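
For anyone curious what that pipeline looks like in code, here is a minimal JavaScript sketch of the same idea, written as a standalone Node script rather than our actual VoiceFlow blocks. The endpoint URL, API key, activity list, and function names are placeholders and assumptions for illustration, not our production values.

    // Minimal sketch of the wellness.ai flow (placeholders, not our exact VoiceFlow code).
    // Requires Node 18+ for the built-in fetch.
    const WATSON_URL =
      "https://api.us-south.tone-analyzer.watson.cloud.ibm.com/instances/YOUR_INSTANCE/v3/tone?version=2017-09-21";
    const API_KEY = "YOUR_IBM_CLOUD_API_KEY";

    // a) Format the raw transcript so Watson receives clean, plain text.
    function formatEntry(rawSpeech) {
      return rawSpeech.trim().replace(/\s+/g, " ");
    }

    // Call the Tone Analyzer and b) parse the JSON response, keeping the strongest tone.
    async function analyzeTone(entry) {
      const res = await fetch(WATSON_URL, {
        method: "POST",
        headers: {
          "Content-Type": "application/json",
          Authorization: "Basic " + Buffer.from("apikey:" + API_KEY).toString("base64"),
        },
        body: JSON.stringify({ text: entry }),
      });
      const data = await res.json();
      const tones = (data.document_tone && data.document_tone.tones) || [];
      tones.sort((a, b) => b.score - a.score); // highest-scoring tone first
      return tones.length ? tones[0].tone_id : "neutral"; // e.g. "joy", "sadness", "anger"
    }

    // The logic step: if the last three stored entries were negative, suggest an activity.
    const NEGATIVE_TONES = ["sadness", "anger", "fear"];
    const ACTIVITIES = ["go for a walk", "call a friend", "write down three things you're grateful for"];

    function maybeSuggestActivity(lastThreeTones) {
      const allNegative =
        lastThreeTones.length === 3 &&
        lastThreeTones.every((tone) => NEGATIVE_TONES.includes(tone));
      return allNegative
        ? ACTIVITIES[Math.floor(Math.random() * ACTIVITIES.length)]
        : null;
    }

In the actual app, the tone returned for each entry was appended to our Google Sheet, and the three-entry check ran against the most recent rows.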

Key takeaways:

  • Can’t wait for the next Hack The North!
  • Making a voice app doesn’t have to be hard.
  • I gained a lot of experience working with APIs.
  • Demoing our LIVE app to the judges was the most satisfying part of the whole experience!
  • If you’d like to try wellness.ai out for yourself, shoot me an email @ abearafat1@gmail.com and I will add you to our alpha tester list!
  • A more in-depth description of our project, along with a demo video, can be found on Devpost. Check it out and give it a like :)

Written by Abe Arafat

Software Engineering @ McGill University
