My Journey into the Voice First World

Every now and then, I get questions about opportunities in the Voice First space, and whether one can make a career out of it. I thought it would be a good idea to detail my journey into the Voice First world and the opportunities I came across, to give a broad answer to these questions.

In late 2017, after I shut down the company I had started, I was keen on being an individual contributor and building things myself.

The Freelancing Phase

So I started freelancing again and went back to coding iOS apps in Swift. I had a couple of great clients who gave me some challenging work. Life was good.

One of the apps I built during this time was Lightdogs: Unplug & Focus on the App Store, a focus timer that helps people stay productive: every time you choose not to engage with your phone, you grow your own library of pets.

Working with a tiny but remotely distributed team, I loved the process of building iOS apps. While there is nothing to love about dealing with Auto Layout constraints in Xcode, what I totally loved was deep diving into the Timer APIs on iOS to make the app work. Cam, who runs Lightdogs, is a great guy to work with and pushed for a high quality bar on design. Out of all the apps I have worked on so far, if I had to pick a favorite, Lightdogs has to be the one.

While freelancing was great, I would often miss the camaraderie of a team. I would be holed up in a co-working cabin the entire day, with no one to ask for help when stuck on an issue, or to share findings about an idea with. Naturally I wanted to change that, and was looking for opportunities to do so.

My first Alexa Project

One of the clients I worked with during this time was Steve from Dabble Lab. I was working with him on a couple of mobile apps when one fine day he asked me if I could help him with an issue in an Alexa skill.

He was working on an Alexa skill that would integrate with your Xero dashboard and answer questions like “What are my sales for this month so far?” or “What is my sales target for this month?”. Always up for exploring new interfaces, I took on the work. I remember spending the next few days trying to get Account Linking working with Xero. When I finally did, I was struck by how easy it was to build an Alexa skill.

But what really got me hooked was the accessibility and ease of using voice as a medium for computing. I could walk into a room and say “Alexa, ask Business Reports how am I doing on my sales target this month”, and I would get an answer. The traditional way would be to turn to the web or mobile app, do a few clicks, and consume the information visually.

Skills and other things!

At Dabble Lab, Steve’s vision was (and I think still is) that digital assistants are the next paradigm of how people will interact with computers, and that this represents a huge opportunity for developers to build solutions for these users.

Over the next few months, I had the opportunity to explore different digital assistant platforms like Twilio Autopilot and Google Actions full time at Dabble Lab. With our experiments growing, our team grew too, and soon I was leading a small but highly energetic team working on all kinds of challenges with voice and digital assistants.

While there are a lot of cool projects we worked on, what I am most thankful for is making coding videos for Dabble Lab’s YouTube channel. Building for Alexa and other platforms brought us face to face with the lack of online resources for builders exploring these platforms. With a bit of a push from Steve, I got in front of the camera and started making videos on solving specific development issues you would face while building.

I ended up making videos for Skill Templates, a quick code starter for Alexa skills, and an entire series on getting started with Google’s Dialogflow. This was a huge step up for me in terms of learning, and for getting out of my comfort zone. I learnt how to make better videos (or rather, I am still learning), and how to stream effectively. Preparing a video on a topic takes you down the rabbit hole of APIs, and my understanding of the entire platform has only improved because of all this research.

There have been instances when people would meet me at my workshops and talk about that one specific video which helped them solve their issue. And for a builder, there is no greater satisfaction than seeing something you created solve problems for people.

While working for Dabble Lab, I had the opportunity to visit the Alexa Agency Partner Summit in Bangalore. This was the first time I had the chance to meet folks from the Amazon Alexa team, and I got a first-hand look at how they see Alexa evolving in the next few years.

Interviewing with Amazon

When I came back, it was time for me to take a small hiatus and get married. I had known Arpita for well over three years, and we had talked about it for the past year with all our families and friends. I know a couple of lines won’t do justice to the entire wedding saga, so I hope to write another blog post about the eventful wedding I had.

After my wedding, things proceeded as usual and we were settling in, when my now colleague Sogan told me about an opening with the Alexa team for a Solutions Architect. Having never worked at a big corporation before, I was a tad nervous and skeptical about joining one. But I decided to give it a shot and see how I would fare.

Finding myself on the other side of the interview table was a bit unnerving at first, but I soon got used to it. Amazon’s interview process is well documented, which made me comfortable with what to expect. I went through two phone screens and five on-site interviews, with different rounds grilling me on everything I knew. My initial apprehensions washed away, and I ended up enjoying the interviews, as they gave me a way to talk about Alexa’s strategy with my prospective colleagues.

Having finished all the interview rounds, I was informed one fine day that I had cleared them. A tad relieved, the first person I talked to about this was Steve. When I shared the news with him, he was incredibly supportive, and we had an in-depth discussion about the voice computing space in general. I don’t think I have ever seen someone as calm and genuine as Steve when it comes to running a company, and I have nothing but high praise for him. If you are into the voice space and looking to work on exciting projects, Dabble Lab is a place I would recommend wholeheartedly.

After a few more discussions with friends at Amazon and mentors, I decided to take up the offer. Due to a small misunderstanding with my recruiter, I signed my offer letter from the island of Koh Rong Samloem in Cambodia, during my honeymoon – but that’s a story for another day.

It’s still Day One

One thing that struck me about Amazon is its self-serve and Day One culture. There are always new challenges to solve for our customers, and all our solutions start with keeping the customer in mind.

In my role at Amazon, I get to meet developers who are building great voice experiences for users. I have also had the great opportunity to witness the launch of Hindi support from close quarters, which means developers can now target the next billion users who will have access to Alexa in Hindi.

But even after a few eventful months at Amazon, it feels like Day One every day I get to work. There are always new challenges to work on, and new stuff happening all the time.

Just the other day, I was explaining to an aunt how to send messages on WhatsApp. Frustrated by typing on the phone, she noticed the mic icon and asked me what it stood for. When I explained that it would transcribe anything she said, she tried it out, and she has been sending messages entirely by voice ever since. This accessibility and ease is why I believe in voice as a new paradigm for interacting with computers.

At Amazon, we believe we are at the cusp of a great revolution in Voice Computing, and there has never been a better time to get onto the bandwagon.

Switching off the lights

Earlier this year, Animesh and I took the tough decision to stop QICE & Learnflow, the companies the two of us started five years back. I write this post as a sort of timeline of how we went from starting up our company to winding it up a few months back. I also plan to write about the lessons learned from all this.

In 2011, as I reached the final semester of college, I hardly had starting a company on my mind. My plan was either to get a job or fly off to the States to do my M.S. But after I had gotten a couple of job offers, I realised they didn’t challenge me enough to warrant giving years of my life to them.

I got a decent score on the GRE, but not enough to make it to the top universities. And frankly, by the time the exam ended, I was just glad I didn’t have to study any more. I couldn’t see myself spending more time staring at books and memorising them, only to spit it all out in an exam. It was becoming clear that further studies were a no-go for me.

As I wondered what to do with my life, I got a call from a friend who had worked for Microsoft. Microsoft had outsourced training to one of their partners, who were looking for trainers. The assignment was to train college students on C#, a language I was quite familiar with. I immediately enlisted Animesh’s help, and thus began our journey.

Somewhere in the middle, one of the companies that had offered me a job called to inform me of the joining date. Waking up from an afternoon nap, I told them I wouldn’t be joining. When pressed for a reason, I just said, “I am sleeping.”

Over the next couple of years, we went to roughly 40 colleges all over India to train students on a variety of topics. Eventually, we got so tired of shuttling between cities and of the infrequent assignments that we set up our own centre in Nagpur.

We began training students in Nagpur and ran multiple batches introducing them to the wonderful world of programming. But somewhere down the line, training started getting to us. I was beginning to understand that programming is not for everyone (and sometimes I even wonder if it is for me).

I met students who would grasp everything on the first day, go figure out more on their own, and come back with the entire program done. These kinds of students hardly needed any training, except for an occasional push in the right direction. And then there were students who joined because they had heard computer programming was a lucrative career. The trouble was, they would sometimes have a tough time grasping the concepts. Even after 20-30 classes, there would be students fumbling with basic GET / POST concepts. It was certainly hard to teach such students. It might sound elitist to say this, but not everyone has an aptitude for programming. (Note: sometimes I falter so much with programming that I wonder if even I have the aptitude for it.)

Honestly, we would have roughly one or two decent students in a batch of 20. This was not working out as well as we had thought. So somewhere down the line, we shifted tracks to building software. We started with building websites, graduated to web apps, and ultimately mobile apps. We were now a web and mobile development agency. This is where the fun began.

We hired a few amazing people, trained them, and worked on some amazing projects. We learnt a whole lot about building web apps and new technologies, and worked with some good clients. We had our fair share of project mismanagement and overshot budgets and timelines – the usual issues. If it were not for this time, I wouldn’t have learnt so much about software and people in such a short span. Running a business was like putting all my skills to the test at the same time. More than anything else, I understood a whole lot about myself – and that has been helpful.

After doing this for a good five years, we reached a point where, for multiple reasons, we were not able to continue what we were doing. When I started, I had never thought of quitting; I had always thought this was what I would be doing forever. For me, dissociating from the company was the toughest part – but it had to be done, no matter what.

And so, after five years, we took the decision to wind up operations. We spent the next couple of months helping people transition to new jobs and helping clients with their projects. Looking back, I have been incredibly lucky to have spent that time of my life on a huge challenge, with some amazing people by my side. In retrospect, this has been one of the best phases of my life.

Securing your WordPress Site – WordCamp 2017

As a developer, I have spoken in front of college students and undertaken corporate training programs in the past – but my first talk at a conference was quite an experience. Having set up multiple WordPress sites over the years for my clients, and having had some of them hacked, I decided to speak about common security measures.

Just before my talk, I had a bit of a mess-up as there was no projector connector for my Mac. As a precaution, I usually always carry one when speaking, but I missed it this time. However, the volunteers quickly got me up and running on another computer.

My talk started pretty well, but since it was being recorded, there were two spotlights shining brightly on my face that didn’t let me gauge the audience’s reaction, except for the front row.

I also lost access to my speaker notes because I had exported my presentation to HTML, but I managed to cover all my points since I had already rehearsed the talk a couple of times.

Post the talk, the Q&A session evoked some questions and discussion as well. One discussion was about how having a different URL for login is not really a security measure, but more of an annoyance for developers working on the project later. Some folks in the audience suggested that it is better to give 444 permissions to files like .htaccess and wp-config.php once they have been set up. This is the most restrictive permission that still lets the server read the file, and it makes the file read-only for everyone.

I think I finished the talk on time, and then took some of the discussions off the stage.

I will share a video of the talk once it has been uploaded by the team.

Geeking out at Wordcamp Nagpur – 2017

I was recently a speaker at the first WordCamp Nagpur – 2017, speaking on “Securing your WordPress Sites”. While Nagpur has had few tech events and community meetups, this was one of the biggest things to happen in Nagpur’s tech scene for a while.

Spread across two days, 24th and 25th June, the WordCamp had lots of technical sessions as well as user-focussed sessions. I am documenting a few ideas that I learnt at WCNagpur. Since I was more interested in the technical sessions, I spent most of my time on the developer tracks.

On the 24th, I started by attending

Do it Yourself – Search Engine Optimisation by Ankit Jaiswal

This was a beginner-level track on Search Engine Optimisation, set up for users new to WordPress. Ankit explained the concept and history of search engines well, and explained why SEO needs to be done.

He shared a few SEO tips, and a few don’ts, for the newcomer audience. From tackling clients who demand top search rankings, to using better names for images – Ankit’s session handled it all in an accessible way.

Rapid Application Development by Anirudha Prabhune

This was a talk / workshop that focussed on how the WPoets team handles complex applications using WordPress. It was a low-down on their plugin, Awesome Studio.

Unit Testing Your Plugins – Manoj Khande

This was an interesting talk by Manoj Khande from Sanisoft. In his talk, he ran us through PHPUnit test cases, explained the WP plugin test framework, and unit tested a plugin. While I have worked with unit testing before, Manoj shared a few tips after his talk that focussed solely on doing better Test Driven Development. Unfortunately, I could not follow along with the instructions, despite Manoj’s preparedness, as PHPUnit refused to work on my machine.

Getting started with WP-CLI – Ajit Bohra

Ajit is one of the most prolific and entertaining speakers I have ever come across. He left the entire room in splits with almost every slide, and after a tiring day of workshops, that is exactly what the audience needed. He explained WP-CLI and some use cases where it can be put to good use.

One tip I picked up from this talk: when using VVV, he uses the actual WordPress domain names to access his local sites. This is such a nifty feature and saves so much time, especially when migrating a site from a local to a production environment. I have used Vagrant for quite a while now, so I was fairly surprised when I came across this, and wondered why it had never struck me 🙂

Post the workshops, we had a plugin pratyogita (contest) where four contestants shared their plugins and working prototypes with everyone. Tarique Sani and Rahul Bansal judged the plugins and grilled the developers on them.

That ended the first day of workshops at WordCamp – 2017. The stage was all set for the main event the next day.

On the 2nd day, we started a bit late due to some technical issues.

Using Vue.js with Rest API – Tushar Joshi

The first talk of the day was on using Vue.js with the REST API. I have attended a couple of JS sessions by Tushar Joshi before, and every time he manages to inspire me with the depth of his knowledge about programming. He explained the REST API in detail, followed by a template Vue.js application. He went to great lengths to make it welcoming and easy to understand for developers who hadn’t tried Vue.js before.

Alternative Development Techniques with WordPress – Amit Singh

Amit Singh from WPoets talked about alternative development techniques with WordPress. His core point was that as a developer, your job is to solve problems, not to write code. He shared an example application and asked the developer crowd how they would build it. After hearing the answers, he explained how he would build it using existing plugins. The Q&A session evoked some debate on plugin bloat, with someone arguing that it is better to write some code than to pull in a plugin full of features you may never use.

Managing WordPress Site as a Composer Project – Rahul Bansal

Rahul Bansal of rtCamp explained his team’s Composer and Git strategy for delivering WordPress sites. This was one of the few sessions where code was demoed. He explained a few plugins that make his development easier.

One of the tips I learnt here was to have different sets of plugins for development and production, which is something Composer lets us do.

Using WordPress for IOT – Vikram Kulkarni

Vikram Kulkarni’s talk revolved around the IoT product he built for agricultural soil analysis. His product is a classic case of not over-engineering anything and building a prototype fast. He explained that since he was quite comfortable with WordPress, instead of learning a new language to build the backend for his product, he used WordPress to build the entire service.

He ended the talk with a call for contributions to WordPress that would benefit the IoT community, so more adopters can use WordPress.

Atomic Design in WordPress – Jayman Pandya

Jayman’s talk was the only design talk scheduled, and it was quite an insightful one. He explained the concept of Atomic Design, and lamented that when it comes to modularity or responsiveness, we often tend to reach for an off-the-shelf framework like Bootstrap or Foundation. He urged the audience to build their own inventory of design elements, talked about how one can go about building a new design language, and gave a few tips on maintaining it as well.

Jayman’s talk also sparked a few conversations in the Q&A session, where some attendees questioned the feasibility of this approach with clients who are not high on budget. Someone also pointed out that custom stylesheets can be used to tailor Bootstrap to your needs.

Securing your WordPress Sites – Karthik Ragubathy

After Jayman’s talk came my own, on securing WordPress sites. I cover it in detail in a separate post.

Are you really a WordPress Developer? – Swapnil Patil

After my talk came Swapnil’s session, “Are you really a WordPress Developer?”. It revolved around the term “developer” and the definitions around it. It was quite a controversial talk, and suffice it to say, many people passionately brought their own angle to it, including yours truly.

Post this, I had to miss two excellent talks, on “Headless CMS – WordPress” and “Making themes and plugins ready for translation”, as I had to head out. I am waiting for these talks to be uploaded.

Making Money with WordPress Sites

I came back just in time for the final user panel on “Making Money with WordPress Sites”, where Kaustubh Katdare, Rohit Langde and Omkar Bhagat talked about how they earn money via blogging. It was a free-wheeling discussion and quite popular with the crowd.

After the panel, it was a general feedback session + time to end the event. Abhishek Deshpande thanked all the volunteers, sponsors, speakers and attendees and gave a good maxim on how to carry forward the community efforts.

Overall, for a first event, the WordCamp was really well organised. Everything from the attendee kits to the food was taken care of. Although there were a few glitches here and there, for a first-time event the team did a great job. I hope the community grows strong from here and plays an important role in introducing people to programming as well as WordPress.

As the folks at #wcnagpur would say, Jai WordPress.

Easier Webhook Testing with ngrok

Picture these scenarios:

You have built a web app that receives webhooks from another service. But every time you want to test it, you have to deploy your app and keep a lookout on your logs as the requests trickle in.


Or: you are working with an in-house mobile developer, and you both want to test an API. You deploy the app, even for a minute change, and then wait to see if everything works properly.

I am sure, as developers, we have all been there. Deploying apps for a couple of small changes, and running them through your entire CI/CD cycle, can be exhausting. So I was delighted when I stumbled upon ngrok.

ngrok is a tool that exposes a server running on your localhost to the Internet. This can save you countless hours of frustration by avoiding deployments for minor changes.

Getting ngrok is as simple as downloading the relevant files from the website. Once downloaded, on Mac / Linux, type

unzip /path/to/

Once you have unzipped the contents, ensure your localhost server is running and you know its port number. To make your server accessible via the Internet, type

./ngrok http your-port-number

This starts ngrok, and your web app is now publicly accessible via the Internet. The terminal also shows the incoming requests to your app.

The Tunnel URL in my case is, which points to port 8000 on my device. You can now update the tunnel URL in your webhooks, or pass it on to your client to test the app.

One cool feature of ngrok is its live dashboard, which allows you to inspect the status of the tunnel as well as the requests coming in. You can access this dashboard by navigating to http://localhost:4040/inspect/http

Apart from using ngrok for development, you can also use it to host your own webmail or other apps. ngrok also offers a paid tier with support for custom domains and other features.

ngrok has easily been one of the tools that has saved me hours of development time, and the paid plan is well worth it.

When cURL fails

Every once in a while as a programmer, you come upon a problem that flummoxes you, irritates you, and takes you through a wide array of emotions – and when you finally solve it, makes you look back and wonder why it took so much time to fix.

Here’s one from my diaries: I was working for a client that handled lots of financial data, building an app with a peer-to-peer wallet system and phone recharges. The app’s backend API was built using a custom PHP framework.

After having built the system, I went to my client’s office to deploy it on their homegrown servers. One thing I should mention: being a company that handles a lot of financial data, the entire campus was very secure – regular audits, access only for authorised personnel. What struck me as odd was that even the computers were not connected to the Internet. A team of 20 programmers shared the two computers that were, using them when they wanted to research the issues they were working on.

Having been given a machine, I installed PHP on the servers and deployed my app so I could start testing. This is where the trouble started. One of the methods called the client’s API to recharge a phone – but no matter what I did, the API call wouldn’t work.

The code in question was a simple cURL snippet:

// create a new cURL resource
$ch = curl_init();
// set URL and other appropriate options
curl_setopt($ch, CURLOPT_URL, "");
curl_setopt($ch, CURLOPT_HEADER, 0);
// grab the URL and execute the request
curl_exec($ch);
// close the cURL resource, and free up system resources
curl_close($ch);

Now here’s the funny part: the same URL was accessible via the browser, Postman, and everywhere else. But through the code, it simply refused to work.

After looking into the logs, and learning more about cURL and how it works, I added a call to the curl_error function to see what was failing in this bit of code.

Error 7 – Couldn’t connect to host.

Ah. Now we were getting somewhere. But there could be multiple reasons why we could not connect to the host: the network dropping out, the Internet not being accessible, and so on.

That’s when it hit me like a flash of light: all the computers at the client’s office went through a proxy. The URLs worked perfectly in the browser because the browsers were configured to use it. I opened a terminal and typed

curl -v 

And sure enough, I got Error 7 again. With a bit more research, I was able to get it working. Adding these two lines to my code, with the correct credentials, fixed it:

curl_setopt($ch, CURLOPT_PROXY, "proxyurl:proxyport");
curl_setopt($ch, CURLOPT_PROXYUSERPWD, "username:password");

And that ended my 12 hours of debugging – leaving me a little wiser about proxies, and with a deeper understanding of cURL.
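Incidentally, the same proxy settings can be passed to the curl command line itself, which is a quick way to confirm the proxy host and credentials before touching any code. A sketch, where the proxy details and target URL are placeholders, not the client’s real values:

```shell
# placeholders only – substitute your actual proxy details and target URL
curl -v --proxy http://proxyurl:proxyport --proxy-user username:password https://example.com/
```

If this succeeds where a plain curl -v fails, the proxy is the culprit, and the CURLOPT_PROXY options above will fix the PHP side too.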

Homo Deus : A Brief History of Tomorrow by Yuval Noah Harari

Buy on Amazon

“Having reduced mortality from starvation, disease and violence, we will now aim to overcome old age and even death itself. Having saved people from abject misery, we will now aim to make them positively happy. And having raised humanity above the beastly level of survival struggles, we will now aim to upgrade humans into gods, and turn Homo sapiens into Homo deus.”

Yuval Noah Harari’s Homo Deus is a book that doesn’t shy away from making bold predictions about the future of humanity. Harari captures in detail what an apocalyptic future might look like centuries, or even decades, from now.

The book starts by detailing humanity’s current and past conditions, and slowly builds up to what may happen to us in the future. In a year that saw Brexit and Trump, one tends to balk when he suggests that the world is a better place than it was before, but YNH backs it up with solid evidence. He points out that sugar is more dangerous today than gunpowder, and that more people die from obesity than from wars.

He effectively downplays the ability of any single political leader to solve global problems or set a new path for humanity.

The wildest dreams of Kim Jong-un and Ali Khamenei don’t go much beyond atom bombs and ballistic missiles: that is so 1945. Putin’s aspirations seem confined to rebuilding the old Soviet zone, or the even older tsarist empire. Meanwhile in the USA, paranoid Republicans accuse Barack Obama of being a ruthless despot hatching conspiracies to destroy the foundations of American society – yet in eight years of presidency he barely managed to pass a minor health-care reform.

He opines that the future is going to be controlled by billionaires with tunnel vision about solving the problems plaguing humanity – namely mortality, and building conscious artificial intelligence systems. He terms this “Dataism”, under which devices monitor everything about us. We will be connected to a central entity, which will suggest better things to us: better tasks, and eventually better mates.

But here is where it gets extra scary (or comforting, based on how you look at it): he suggests that what Homo sapiens did to animals in the name of industrialisation (food production, rearing, etc.) will come back to haunt Homo sapiens. He predicts that one day all of us will be just biochemical beings, informed by a vast network of algorithms about how we feel from second to second.

My Review:

I think YNH makes a few bold claims in the book, and all of these ideas need to be taken with a pinch of salt.

For starters, he downplays the billionaires who are out to solve global problems like disease and inequality – the Gates Foundation’s efforts suggest otherwise.

He also seems very optimistic about IoT devices, yet they have flaws that make one wonder if they will ever reach that level of consciousness. For example, late last year there was a huge DDoS attack carried out through these very devices, and some of them are not really that smart.

But on the other hand, he does make a compelling point about how algorithms are ruling our lives and are taking away our jobs. Countries are now exploring basic income plans as we find ourselves marginalised because of faster robots and smaller silicon chips.

The entire chapter where he demolishes the idea of consciousness is eerily captivating, and there are many places throughout the book where his ideas will convince you in a similar fashion.

A fast-paced non-fiction work, and definitely a must-read.


In one of the recent projects I have been working on, I had to look for a word in a phrase – but the challenge was to rank an occurrence higher when the search term appears as a standalone word than when it appears inside another word.

For example: “Food Blogger” should rank higher than “Ardent Foodie”, because in the first, “Food” is a standalone word, while in the latter it is part of a longer word.

At times like this, I find MySQL’s REGEXP quite useful, as I can use different regular expressions to fit my criteria. Here’s what I did.

I first looked for an individual occurrence of a word.

select * from table_name where search_term REGEXP '[[:<:]]food[[:>:]]';

Here, [[:<:]] & [[:>:]] are markers that stand for word boundaries. They match the beginning and end of words, respectively. Now, as a next step, I follow it by this query.

select * from table_name where search_term LIKE '%food%';

When the first case matches, it should get a higher priority than the second, so I reworked the query into this:

select table_name.title, (case when search_term REGEXP '[[:<:]]food[[:>:]]'  
then 100 when search_term LIKE '%food%' then 99 else 1 end) as priority  
from table_name order by `priority` desc;  

That way, results containing the standalone word rank higher than those where it is only part of a longer word.
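The same standalone-versus-substring distinction can be sanity-checked from a shell with grep before writing any SQL. A minimal sketch (the file path is made up for illustration):

```shell
# Build a tiny sample file with one phrase per line
printf 'Food Blogger\nArdent Foodie\n' > /tmp/terms.txt

# -w matches whole words only, like [[:<:]]food[[:>:]] in MySQL
grep -w 'Food' /tmp/terms.txt

# plain case-insensitive substring match, like LIKE '%food%'
grep -i 'food' /tmp/terms.txt
```

The whole-word search matches only “Food Blogger”, while the substring search matches both lines, mirroring the two SQL queries above.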

Deploying with Deploybot

I vividly remember the first time I messed up a production server. It was in my early days as a programmer, and we had just got our first client.

Back then, my deployment strategy was basically to upload files using FTP and then run any commands on the server via the shell. During a routine deployment, I noticed a stray file left on the server, and in trying to remove it, I typed sudo rm -rf /.

On production.

I watched the next few minutes in horror as the client’s entire machine was wiped clean and the site went down. Fortunately, my client was understanding and we had backups – so there was not much damage – but I had to spend the next 3 days fixing the mess (and contemplating whether I was really cut out for this job).

My biggest lesson from the incident was to be very careful on production. Over time, I learned Git and other tools, which made deployments easier and safer. As someone developing in Laravel and leading a team of Laravel developers, I am always on the lookout for ways to make deployments easier.

I have tried everything from custom bash scripts to git workflows where we would git pull on the server. None of them stuck, however, primarily due to the complexity they brought in.

After much experimentation, my team and I zeroed in on DeployBot.

DeployBot allows you to deploy code from anywhere. It takes your code from GitHub, Bitbucket, or self-hosted Git repositories and deploys it to any server. At QICE, we primarily use DigitalOcean and AWS – both of which are supported by DeployBot, making it easy to integrate into our projects.

Here’s how DeployBot has helped us.

Continuous Deployment

Over the course of a day, we make 2-3 deployments to our sandboxes on certain projects, and these are fairly large commits. DeployBot seamlessly picks up the new commits and automatically (or manually, for production setups) deploys the latest files to the server.

My team no longer has to worry about deploying to the server. All we have to do is push to a branch, and we know it will end up on the server.


Rollbacks

Despite much preparation, there are moments when things don’t work in production for strange reasons. DeployBot has a rollback-to-a-specific-version feature, which is quite nifty at times like these.

Pre and Post Deployment Commands

After deployment, we run a few commands – for example, Gulp builds, migrations, and Composer updates.

DeployBot allows us to specify which commands to run before and after each deployment. That means more peace of mind for developers, and no switching over to the server to type each command on production machines.
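For a Laravel project, the post-deployment commands might look something like the snippet below. These are illustrative, not our actual configuration – the exact commands depend on the project’s tooling:

```shell
composer install --no-dev --optimize-autoloader  # install PHP dependencies
php artisan migrate --force                      # run pending migrations non-interactively
gulp build                                       # compile front-end assets
```

The --force flag matters here: without it, artisan migrate prompts for confirmation in production, which would hang an unattended deployment.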

Modifying Configuration Files

Even after all this, you may sometimes have to go to the server to edit your configuration files.

DeployBot eliminates this as well, by letting you enter your configuration files in its dashboard. Just ensure all the changes are in your configuration files before you deploy, and they will go out with your next deployment.


Slack & Email Notifications

Pretty much every web app these days has Slack / email integration – and so does DeployBot. It notifies our Slack channels every time there is a deployment.

No more manually informing the entire team that the production deployment is done and they can resume their work.

Amazing Support & Reliability

This is something important to us. In the past year of using DeployBot, we faced exactly one minute of downtime where we couldn’t deploy to production. We reached out to support and got a reply within the next minute telling us the issue had been fixed.

Thanks to DeployBot, my team and I can now focus on building stuff rather than worrying about getting it to our customers. If you develop web apps, it is an invaluable addition to your toolset and takes care of your deployment worries.

Autosaving in Xcode

While working on a Swift project, I spent around 3-5 hours adjusting a few pop-ups and fine-tuning the design of a screen. As happens when working in the zone, I skipped committing along the way, intending to commit everything at the end of the sprint.

The other day, I had been helping a colleague with git and explaining the perils of git checkout. As it turns out, my brain decided to surface that command exactly while I was committing, and I ended up typing:

git checkout . (instead of git add .)
git commit -m "My Commit Message" 
git push origin feature-branch

I didn’t realize my mistake until I came back to Xcode and noticed 4+ hours of work totally undone. Panicked, my first reaction was to find out how to recover files from an accidental checkout. Turns out, you cannot: unless you have at some point run git add or git stash on the files, there’s nothing to be done.

I was about to lose hope when I saw an answer on Stack Overflow suggesting I look through the IDE’s auto-save files. A few more searches later, it turned out OS X does auto-save files.

Thankfully, I was able to recover the entire code from an auto-saved version from 2-3 hours earlier, and the day was saved thanks to auto-save.

Still, this was a lesson in reinforcing strict programming and source-control habits:

  • Always think once before you use the . pathspec – and twice before using git checkout.
  • Make sure you add / stash your code in small task increments. Finished a small task? Add it to the index. Going to work on it later? Stash it.
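A quick demo of why the second habit saves you: git checkout with a pathspec restores files from the index, not from HEAD, so anything you have staged with git add survives it. The repo, file name, and contents below are made up for illustration:

```shell
# Set up a throwaway repo
repo=$(mktemp -d) && cd "$repo" && git init -q
git config user.email demo@example.com && git config user.name demo
echo "original" > View.swift
git add View.swift && git commit -qm "initial commit"

# Unstaged edits are what `git checkout .` silently destroys
echo "hours of work" > View.swift
git checkout -- .
cat View.swift               # prints "original" - the edit is gone

# Staged edits survive, because checkout restores from the index, not HEAD
echo "hours of work" > View.swift
git add View.swift
git checkout -- .
cat View.swift               # still prints "hours of work"
```

Staging in small increments means an accidental checkout can only ever cost you the work since your last git add, not the whole sprint.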