Easier Webhook Testing with Ngrok

Picture this scenario:

You have built a web app that receives webhooks from another service. But every time you want to test it, you have to deploy your app and keep an eye on your logs as the requests trickle in.

Or:

You are working with an in-house mobile developer, and you both want to test an API. You deploy the app, even for a minute change, and then wait to see if everything works properly.

As developers, I am sure we have all been there. Deploying an app for a couple of small changes, and running it through your entire CI/CD cycle, can be exhausting. So I was delighted when I stumbled upon ngrok.

ngrok is a tool that exposes your localhost to the Internet. It can save you countless hours of frustration by letting you skip deployments for minor changes.

Running ngrok is as simple as downloading the build for your platform from the ngrok website. Once downloaded, on Mac / Linux, unzip it – something like this, assuming the zip landed in your Downloads folder:
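unzip ~/Downloads/ngrok.zip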

Once you have unzipped the contents, ensure your localhost is running and you know the port number. To make your server accessible via the Internet, type the following, replacing 8000 with whatever port your app listens on:
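./ngrok http 8000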

This starts ngrok, and your web app is now publicly accessible via the Internet. The terminal also shows the incoming requests to your app.

The tunnel URL in my case is http://b3991383.ngrok.io, which points to port 8000 on my machine. You can now update the tunnel URL in your webhooks, or pass it on to your client to test the app.

One cool feature of ngrok is its live dashboard, which allows you to inspect the status of the tunnel as well as the requests coming in. You can access this dashboard by navigating to http://localhost:4040/inspect/http

Apart from using ngrok for development, you can also use it to host your own webmail or other apps. ngrok also offers a paid tier with support for custom domains and other features.

ngrok has easily been one of the tools that saved me the most hours of development time, and the paid plan is well worth it.


When cURL Fails

Every once in a while, as a programmer, you come upon a problem that flummoxes you, irritates you, and takes you through a wide array of emotions – and when you finally solve it, makes you look back and wonder why it took so long to fix.

Here’s one from my diaries: I was working for a client that dealt with a lot of financial data, building an app around a peer-to-peer wallet system and phone recharges. The app’s backend API was built on a custom PHP framework.

After building the system, I went to my client’s office to deploy it on their homegrown servers. One thing I should mention: being a company that handles a lot of financial data, the entire campus was very secure – regular audits, access only to authorised personnel. What struck me as odd was that even the computers were not connected to the Internet. A team of 20 programmers shared the two machines that were, using them whenever they needed to research an issue they were working on.

Having been given a machine, I installed PHP on the servers and deployed my app so I could start testing. This is where the trouble started. For one of the methods, I had written some code that called the client’s API to recharge a phone – but no matter what I tried, the call wouldn’t work.

Looking at the code, it was a simple cURL snippet, roughly along these lines – the endpoint and fields here are placeholders for the client’s real recharge API:
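&lt;?php
// a sketch of the kind of call involved – endpoint and fields are placeholders
$ch = curl_init('https://api.example.com/recharge');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query([
    'phone'  => '9999999999',
    'amount' => 100,
]));
$response = curl_exec($ch);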

Now here’s the funny part – the same URL was reachable from the browser, from Postman, from everywhere. But from the code, it simply refused to work.

After looking into the logs – and learning more about cURL and how it works – I used the curl_error function to see what was failing in this bit of code.
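In PHP, that check looks something like this – a sketch, reusing the $ch handle from the snippet above:

if ($response === false) {
    // curl_errno returns the numeric error code, curl_error the message
    echo curl_errno($ch) . ': ' . curl_error($ch);
}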

Error 7 – Couldn’t connect to host.

Ah, now we were getting somewhere. But there could be multiple reasons for not being able to connect to the host – the network dropping out, the Internet being unreachable, and so on.

That’s when it hit me like a flash of light – all the computers at the client’s office went through a proxy. The URLs worked perfectly in the browser because the browsers were configured to use it. I opened a terminal and typed

curl -v www.google.com 

And sure enough – I got error 7 again. With a bit more research, I was able to get it working: adding two lines to my code, with the correct proxy credentials, fixed it. They looked something like this – the proxy address and credentials here are placeholders for the client’s real ones:
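curl_setopt($ch, CURLOPT_PROXY, 'http://10.0.0.1:8080');
curl_setopt($ch, CURLOPT_PROXYUSERPWD, 'username:password');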

And that ended my 12 hours of debugging this issue – leaving me much wiser about proxies and with a deeper understanding of cURL.


Homo Deus: A Brief History of Tomorrow by Yuval Noah Harari


“Having reduced mortality from starvation, disease and violence, we will now aim to overcome old age and even death itself. Having saved people from abject misery, we will now aim to make them positively happy. And having raised humanity above the beastly level of survival struggles, we will now aim to upgrade humans into gods, and turn Homo sapiens into Homo deus.”

Yuval Noah Harari’s Homo Deus is a book that doesn’t shy away from making bold predictions about the future of humanity. Harari captures in detail what an apocalyptic future might look like decades or even centuries from now.

The book starts by detailing humanity’s current and past condition, and slowly builds up to what may happen to us in the future. In a year that saw Brexit and Trump, one tends to balk when he suggests that the world is a better place than it was before, but Harari backs it up with solid evidence. He points out that sugar is more dangerous today than gunpowder, and that more people die from obesity than from wars.

He effectively downplays the ability of any single political leader to solve global problems or to set a new path for humanity.

The wildest dreams of Kim Jong-un and Ali Khamenei don’t go much beyond atom bombs and ballistic missiles: that is so 1945. Putin’s aspirations seem confined to rebuilding the old Soviet zone, or the even older tsarist empire. Meanwhile in the USA, paranoid Republicans accuse Barack Obama of being a ruthless despot hatching conspiracies to destroy the foundations of American society – yet in eight years of presidency he barely managed to pass a minor health-care reform.

He opines that the future will be controlled by billionaires with tunnel vision about solving the problems plaguing humanity – mortality above all – and about building conscious artificial intelligence systems. He terms the resulting worldview “Dataism”, under which devices monitor everything about us. We will be connected to a central entity, which will suggest better things to buy, better tasks to do, and eventually better mates.

But here is where it gets extra scary (or comforting, depending on how you look at it): he suggests that what Homo sapiens did to animals in the name of industrialisation (food production, rearing, and so on) will come back to haunt Homo sapiens. He predicts that one day we will all be mere biochemical beings, informed by a vast network of algorithms about how we feel from second to second.

My Review:

I think Harari makes a few bold claims in the book, and all of these ideas need to be taken with a pinch of salt.

For starters, he downplays the impact of billionaires who are out to solve global problems like disease and inequality – the Gates Foundation’s efforts suggest otherwise.

He also seems remarkably optimistic about IoT devices, yet they have flaws that make one wonder whether they will ever approach anything like consciousness. For example, late last year there was a huge DDoS attack carried out through these very devices, and some of them are not really that smart.

On the other hand, he does make a compelling point about how algorithms are ruling our lives and taking away our jobs. Countries are now exploring basic income schemes as people find themselves marginalised by faster robots and smaller silicon chips.

The entire chapter where he demolishes the idea of consciousness is eerily captivating, and there are many places throughout the book where his ideas will convince you in similar fashion.

A fast-paced work of non-fiction, and definitely a must-read.


REGEXP with MySQL

In one of the recent projects I have been working on, I had to look for a word in a phrase – the challenge was to rank a match higher when the word occurred on its own than when it occurred as part of another word.

For example, “Food Blogger” should rank higher than “Ardent Foodie”, because in the first phrase “Food” stands alone, while in the latter it is part of a longer word.

At times like this, I find MySQL’s REGEXP operator quite useful, as I can use different regular expressions to fit my criteria. So here’s what I did.

First, I looked for the word occurring on its own:

select * from table_name where search_term REGEXP '[[:<:]]food[[:>:]]';

Here, [[:<:]] and [[:>:]] are markers that stand for word boundaries, matching the beginning and end of a word respectively. As a next step, I followed it with this query:

select * from table_name where search_term LIKE '%food%';

Since a standalone match should get higher priority than a substring match, I reworked the two queries into one – a sketch below, with illustrative table and column names:
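select *, search_term REGEXP '[[:<:]]food[[:>:]]' as is_whole_word
from table_name
where search_term LIKE '%food%'
order by is_whole_word desc;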

That way, results containing the word on its own rank higher than results where it is merely part of another word.


Deploying with DeployBot

I vividly remember the first time I messed up a production server. It was in my early days as a programmer, and we had just landed our first client.

Back then, my deployment strategy was basically to upload files over FTP and then run any commands on the server via the shell. During a routine deployment, I noticed a stray file left on the server, and in trying to remove it, I typed sudo rm -rf /.

On production.

I watched in horror over the next few minutes as the client’s entire machine was wiped clean and the site went down. Fortunately, my client was understanding and we had backups, so there was not much lasting damage – but I had to spend the next three days fixing the mess (and contemplating whether I was really cut out for this job).

My biggest lesson from the incident was to be very careful on production. Over time, I learned Git and other tools, which made deployments easier and safer. As someone developing in Laravel, and leading a team of Laravel developers, I am always on the lookout for ways to make deployments easier.

I have tried everything from custom bash scripts to Git workflows where we would git pull on the server. None of them stuck, however, primarily due to the complexity they brought in.

After much experimentation, my team and I zeroed in on DeployBot.

DeployBot allows you to deploy code from anywhere. It takes your code from GitHub, Bitbucket, or self-hosted Git repositories and deploys it to any server. At QICE, we primarily use DigitalOcean and AWS – both are supported by DeployBot, making it easy to integrate into our projects.

Here’s how DeployBot has helped us:

Continuous Deployment

Over the course of a day, we make two or three deployments to our sandboxes on certain projects, often with fairly large commits. DeployBot seamlessly picks up the new commits and deploys the latest files to the server automatically (or manually, for production setups).

My team no longer has to worry about deploying to the server. All we have to do is push to a branch, and we know the code will end up on the server.

Rollbacks

Despite much preparation, there are moments when things don’t work on production for weird reasons. DeployBot can roll back to a specific version, which is quite nifty at times like these.

Pre and Post Deployment Commands

After deployment, we run a few commands – for example, Gulp builds, database migrations, and Composer updates.

DeployBot lets us specify which commands to run before and after deployment. That means more peace of mind for developers, with no switching to the server and typing each command on production machines by hand.
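As a sketch, the post-deployment commands for one of our Laravel projects look something like this – the exact steps vary by project:

composer install --no-interaction --prefer-dist
php artisan migrate --force
gulp --production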

Modifying Configuration Files

Even after all this, you may sometimes have to go to the server to edit your configuration files.

DeployBot eliminates this as well, by letting you define your configuration files in its interface. Just ensure all the changes are in your configuration file before you deploy, and they will go out with your next deployment.
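For our Laravel apps, that typically means the .env file – something like this, with illustrative values:

APP_ENV=production
APP_DEBUG=false
DB_HOST=127.0.0.1
DB_DATABASE=app
DB_USERNAME=app
DB_PASSWORD=secret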

Notifications

Pretty much every web app these days has Slack and email integration – and so does DeployBot. It notifies us in our Slack channels every time there is a deployment.

No more informing the entire team that the production deployment is done and they can resume their work.

Amazing Support & Reliability

This is something of real importance to us. In the year we have used DeployBot, we faced exactly one minute of downtime during which we couldn’t deploy to production. We reached out to support and got a reply within the next minute telling us the issue had been fixed.

Thanks to DeployBot, my team and I can now focus on building things rather than worrying about getting them to our customers. If you develop web apps, it is an invaluable part of your toolset and takes care of your deployment worries.


Autosaving in Xcode

While working on a Swift project, I spent three to five hours adjusting a few pop-ups and fine-tuning the design of a screen. As it happens, while working in the zone, I skipped committing along the way, intending to commit the code only after the entire sprint of work.

Now, just the other day, I had been helping a colleague with Git and explaining the perils of git checkout. As it turns out, my brain decided to recall the command at exactly the wrong moment, and while committing I ended up typing
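git checkout .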

I didn’t realise my mistake until I came back to Xcode and noticed all four-plus hours of work totally undone. Panicked, my first reaction was to find out how to undo an accidental checkout. Turns out, you cannot: unless you have at some point run git add or git stash on the files, there is nothing to recover from.

I was about to lose hope when I saw an answer on Stack Overflow about looking through the IDE’s auto-saved files. A few searches later, it turned out that OS X does auto-save files.

Thankfully, I was able to recover the entire code from an auto-saved version two to three hours old, and the day was saved thanks to auto-save.

Still, this was a lesson in reinforcing strict programming and source-control habits:

  • Always think once before using the . operator – and twice before using git checkout.
  • Add or stash your code in smaller task increments. Finished a small task? Add it to the index. Going to work on it later? Stash it (see the commands below).
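For instance, the two habits that would have saved me here – commands illustrative:

git add -A     # stage finished work; content in the index can be recovered later
git stash      # shelve work in progress; restore it with git stash pop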