I graduated from IIT Bombay in 2009 with a degree in Mechanical Engineering.
I worked for a year making steel at SAIL (Durgapur). Later, I moved on to teaching Physics to IIT aspirants.
After teaching for around three years, I moved to Mumbai in 2014 to work on a startup with a friend.
I spent around a year on that project but had to close it down.
After that I joined another startup, Taskbob (an on-demand home services company), as a Category Manager.
I then moved up to head Product & Data at Taskbob, where I spent a lot of time analyzing the product and business data.
After leaving Taskbob I started working on deep learning projects as a consultant.
I worked with the startup Transporter on their route optimization problems.
Then I joined Brillio as a senior lead data scientist.
There I worked mainly on neural networks, starting with a client in the medical domain on an image-related problem.
Then I worked on a scoring engine for project health, which involved information extraction from unstructured data (NLP).
I also built an enterprise-level face recognition application for security.
One of my major projects at Brillio was building a recommendation engine based on ‘visual similarity’.
This was difficult because similarity is a subjective concept, so we had to come up with a way to make it objective.
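The interview doesn't describe the exact method, but one common way to make visual similarity objective is to embed each image with a CNN and compare items by cosine similarity of their embeddings. A minimal sketch of that idea, where the item names and the hand-written 4-dimensional vectors are made-up stand-ins for real CNN embeddings:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def most_similar(query_emb, catalog, top_k=2):
    """Rank catalog items by embedding similarity to a query image."""
    scored = [(name, cosine_similarity(query_emb, emb))
              for name, emb in catalog.items()]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:top_k]

# Toy "embeddings" standing in for CNN feature vectors.
catalog = {
    "red_dress":  np.array([0.9, 0.1, 0.0, 0.2]),
    "blue_jeans": np.array([0.1, 0.8, 0.3, 0.0]),
    "red_skirt":  np.array([0.7, 0.3, 0.2, 0.4]),
}
query = np.array([0.85, 0.15, 0.05, 0.25])  # embedding of the query image
print(most_similar(query, catalog))
```

Once similarity is a number, "similar-looking items" becomes an objective ranking rather than a matter of opinion.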
Currently I am working at Sigtuple, trying to automate pathology tests end to end.
I worked as a mechanical engineer for one year (at SAIL) but then moved into teaching.
In 2014 I got into the startup ecosystem.
That is when I got involved with data analysis; during my time at Taskbob as Head of Product & Data I spent a lot of time
understanding various techniques. I was very good at maths, and when I started learning machine learning theory I connected with it and
realized that I could really excel in this field. To speed up my learning I took many Udacity courses and
did extra reading to grasp everything.
I read 50+ research papers in a very short time. The more I got into it, the more satisfied I felt.
The most important aspect of deep learning is that it learns features by itself.
In traditional machine learning, one has to do feature engineering, so performance depends on your understanding of the data and your feature engineering skills.
When the problem is complex, you may not be able to come up with sufficiently good features.
This is where deep learning helps: it extracts features from the raw data. That is why deep learning is working better in almost all domains.
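As a toy illustration of features being learned rather than engineered (a sketch of my own, not from the interview), a tiny two-layer NumPy network can solve XOR, a problem where no single linear feature of the raw inputs separates the classes; the hidden layer discovers its own features through backpropagation:

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# XOR: the raw inputs are not linearly separable, so no single
# hand-engineered linear feature solves it.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Tiny 2-4-1 network: the hidden layer learns its own features.
W1, b1 = rng.normal(0, 1, (2, 4)), np.zeros(4)
W2, b2 = rng.normal(0, 1, (4, 1)), np.zeros(1)

lr, losses = 1.0, []
for _ in range(5000):
    h = sigmoid(X @ W1 + b1)      # learned intermediate features
    out = sigmoid(h @ W2 + b2)    # prediction
    losses.append(float(np.mean((out - y) ** 2)))
    # backpropagation of the squared-error loss
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(0)

print(f"loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

Nothing in the code tells the network what the useful intermediate quantities are; gradient descent finds them, which is the point scaled up by deep networks on images and text.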
While at Transporter I worked on unsupervised learning and route optimization.
After joining Brillio I have mostly worked on supervised learning, mainly deep learning (computer vision and NLP).
At Sigtuple it is almost all computer vision deep learning (weakly supervised).
This is difficult to pin down, but I like the YOLO series of algorithms.
Its inventor has done a fantastic job of building a single-stage object detector.
I also use focal loss heavily in my training; it has consistently given better results.
I have also modified it to take care of some of its shortcomings.
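For reference, the binary focal loss from the RetinaNet paper (Lin et al., 2017) down-weights easy, well-classified examples via a (1 - p_t)^gamma factor so training focuses on hard ones; the modification mentioned above is not shown here. A minimal NumPy sketch:

```python
import numpy as np

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Binary focal loss: -alpha_t * (1 - p_t)**gamma * log(p_t).

    p: predicted probability of the positive class; y: 0/1 labels.
    gamma > 0 shrinks the loss of well-classified examples, and
    alpha balances the positive/negative classes.
    """
    p = np.clip(p, 1e-7, 1 - 1e-7)          # numerical safety for log
    p_t = np.where(y == 1, p, 1 - p)        # prob. of the true class
    alpha_t = np.where(y == 1, alpha, 1 - alpha)
    return -alpha_t * (1 - p_t) ** gamma * np.log(p_t)

# An easy positive (p=0.9) is penalized far less than a hard one (p=0.1).
easy = focal_loss(np.array([0.9]), np.array([1]))[0]
hard = focal_loss(np.array([0.1]), np.array([1]))[0]
print(f"easy: {easy:.4f}, hard: {hard:.4f}")
```

With gamma = 0 and alpha = 0.5 it reduces to (scaled) cross-entropy, which is why it is an easy drop-in replacement in detector training.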
One can compare artificial neural network models with the human brain from the
perspective of neurons: in humans, many neurons are connected, signals passing between them carry information, and the brain acts on that information.
In the same way, ANNs have neurons connected in a chained fashion, with information flowing through them.
But artificial neural networks are nowhere close to how the brain works.
ANNs need a lot of data to learn, while we humans don't.
ANNs do not learn the structure of the world as a whole.
An ANN trained on one specific task cannot perform a different task, but we humans can extrapolate our understanding of the world and
perform tasks even if we have not been trained for them.
In the real world, text and voice constitute 70-80% of our communication,
so NLP has many more applications than computer vision.
In domains like law, NLP is being applied to help lawyers prepare a strong case for their clients by extracting relevant points
from past orders and laws. Chatbots in customer service are the biggest application of NLP.
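As a toy sketch of that kind of extraction (my own illustration, not a production legal-NLP system, and the sample text is invented), one can rank the sentences of a past order by word overlap with a query; real systems use far richer models, but the goal is the same:

```python
import re

def extract_relevant(document, query, top_k=1):
    """Return the document sentences with the most word overlap
    with the query -- a crude relevance-extraction heuristic."""
    q_terms = set(re.findall(r"\w+", query.lower()))
    sentences = re.split(r"(?<=[.!?])\s+", document.strip())
    scored = [(len(q_terms & set(re.findall(r"\w+", s.lower()))), s)
              for s in sentences]
    return [s for score, s in sorted(scored, key=lambda x: -x[0])[:top_k]]

order = ("The tenant failed to pay rent for six months. "
         "The court noted the lease allowed eviction for non-payment. "
         "Costs were awarded to neither party.")
print(extract_relevant(order, "eviction for non-payment of rent"))
```

Replacing the overlap score with embeddings or a trained relevance model gives the kind of system the legal use case actually needs, but the pipeline shape (split, score, rank) stays the same.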
To all the students: if you want to be an above-average data scientist, focus on a few things:
1. Learn the concepts in depth, not just the procedure of applying algorithms through a library.
2. Gain depth in one algorithm first; the others will then be easier to grasp.
3. If you haven't taken a relevant course at college, which is mostly the case, invest in yourself by taking a very good course online.
Online courses are structured, so they speed up your learning.
4. Avoid copy-pasting from the internet in your initial years in data science.
It does not help much. Try to write your own code and understand how things work.
5. Keep learning new things, but don't try to learn everything in one go.
I like to cook, and for the last two years I have been cooking daily.
On the weekend there's always something special.
I am not a book lover, so I haven't read many books, but I do read things related to my domain and stay up to date.