How Lambda Labs Gave Me Real-World Experience

Caleb McKay
3 min read · Jan 7, 2021

It all started 6 months ago. I was sitting in my room and saw an ad for a revolutionary new school: Lambda School. Its goal was to give students a new way to learn in the middle of an ever-changing world.

For me, I was lucky. Many people in my family were in the tech field, and I saw first-hand how working in it could literally change their career paths, and their lives. So I went for it. I signed an ISA and was approved the next day, eager to go down an unfamiliar path and turn over a new leaf.

Fast forward to today, and I’m a completely different person with a whole new set of skills. Throughout my journey, I learned statistics, picked up different tools for analyzing data, and even learned how to create neural networks for advanced predictive modeling.

Currently I’m near the end of the Lambda experience, in this thing called Lambda Labs. It’s somewhat of a month-long internship where you get the chance to work side by side with other developers on a real-life project that impacts people around the world.

For my project, I worked with an organization called “Story Squad”, whose vision is to help kids develop their creativity by getting off their phones and instead creating stories with their imagination.

The problem the other data scientists and I were working to solve was how to use an engine or API to automatically recognize and transcribe handwritten text.

The Challenge Begins

The first week of Labs was the planning stage. It wasn’t too difficult: we asked questions we needed answers to, we broke the ice with teammates, and we planned out what exactly we were going to do for the next few weeks in terms of progress. But with the data we had, it wasn’t so simple.

The main challenge was an unfamiliar text recognition engine called “Tesseract”. We spent a couple of weeks working with it before ultimately choosing to stick with the Google Vision API, which we deployed on Elastic Beanstalk.

The architecture was already partly constructed when we were initially handed the reins, which gave us the confidence to try switching over to a more economical text transcriber, Tesseract. The main problem with this model was sorting through the thousands of submissions needed to train it.

When attempting to use Tesseract, we continually ran into the same challenge: the model couldn’t correctly sort through the dataset. After about two weeks of this, we decided to move forward with the Google Vision API instead.
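To give a flavor of the kind of filtering Tesseract’s output invites, here is a small sketch, not our actual pipeline. Tesseract (via `pytesseract.image_to_data`) can emit TSV rows with a per-word `conf` column, and a helper like the hypothetical `words_above_confidence` below keeps only the words the engine was reasonably sure about:

```python
def words_above_confidence(tsv: str, min_conf: float = 60.0) -> list:
    """Parse Tesseract-style TSV output (as produced by
    pytesseract.image_to_data) and keep words whose confidence
    clears min_conf. Rows with conf == -1 are structural, not words.
    """
    lines = tsv.strip().splitlines()
    header = lines[0].split("\t")
    conf_i, text_i = header.index("conf"), header.index("text")
    words = []
    for line in lines[1:]:
        cols = line.split("\t")
        if len(cols) <= max(conf_i, text_i):
            continue  # malformed or truncated row
        try:
            conf = float(cols[conf_i])
        except ValueError:
            continue
        if conf >= min_conf and cols[text_i].strip():
            words.append(cols[text_i])
    return words
```

With handwriting, many words come back below any reasonable threshold, which is roughly the wall we kept hitting.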

Next, after getting the Google Vision API deployed locally and working out the details there, we deployed the app to AWS Elastic Beanstalk. It was very tricky at first because Elastic Beanstalk was new to us, but we were ultimately able to deploy it there and ship the endpoints over to the back-end developers.
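The core of what we shipped boils down to a single call into Google’s handwriting-friendly endpoint. The sketch below is illustrative, not our production code: the function name `transcribe` is made up, and the client is passed in so the wrapper can be exercised without credentials. It mirrors how `google.cloud.vision.ImageAnnotatorClient.document_text_detection` is called with raw image bytes:

```python
def transcribe(image_bytes: bytes, client) -> str:
    """Send raw image bytes to a Vision-style client and return the
    full transcribed text.

    `client` is expected to expose document_text_detection(image=...)
    like google.cloud.vision.ImageAnnotatorClient does; in production
    you would construct that client with real credentials.
    """
    # document_text_detection is tuned for dense/handwritten text,
    # unlike the plain text_detection endpoint.
    response = client.document_text_detection(image={"content": image_bytes})
    if response.error.message:
        raise RuntimeError(response.error.message)
    return response.full_text_annotation.text
```

Injecting the client also made it easy to stub the API in tests, so the rest of the app could be developed without burning real API calls.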

The End In Mind

With Labs coming to a close, we have somewhat of a finished product. The web team absolutely crushed it this month with all the features they’ve included, and our data science team went above and beyond, investing countless hours learning and applying new methods for building machine learning models that weren’t covered in the core curriculum.

All in all, it was a great experience with a great team, and I wish all my teammates and everyone in the Lambda community success, however they define it.
