Is AI/ML still a buzzword?
Academia vs Industry — ?
Over the past couple of years, the advancements have been quite astonishing. From recognising handwritten digits and suggesting YouTube videos to generating entirely new images from a text prompt, we have come a long way from the world of Linear Regression. Sustained commitment from researchers has dramatically raised the game. As we look at the field of AI/ML, the world is quickly converging on pandas, Keras, and TPUs or GPUs. What makes it so popular?
- Free access to lectures straight from the pioneers of ML/DL makes the basics reachable for students who want to learn and implement them. The field is now open for experimentation and no longer confined to research labs or industry giants.
- Access to cloud GPUs and TPUs on Google Colab comes with a pre-configured environment: an off-the-shelf playground with the dependencies already installed, a neat UI to import files, and support for an extensive set of Python libraries to fiddle with. It's your Jupyter notebook with the power of the HULK! I personally recommend checking out the Colab features (see the short sketch after this list for a quick way to confirm what your runtime offers).
- Kaggle competitions thrive on a competitive environment focused on implementing solutions to near real-world problems. It is a platform with well-defined problems and clean datasets on which you can simply let your algorithms loose and get tangible results. The gamification makes the challenges really fun. Kaggle has also introduced notebooks (kernels) backed by GPUs and TPUs, giving you a built-in platform to train and run your initial experiments.
- Finally, it boils down to demand vs supply. Industry is still hunting for experienced practitioners who can take this black box and add something valuable to the existing decision-making pipeline. Decision-making powered by the intrinsic dependencies hidden across many dimensions of data can bear fruit for companies.
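As a quick aside on the Colab point above, here is a minimal sketch, assuming a standard Colab runtime with TensorFlow 2.x and pandas pre-installed (not an official Colab snippet), of how you might confirm that the libraries and a GPU are actually available before you start training:

```python
# Minimal sketch: confirm the pre-installed libraries and the GPU runtime.
# Assumes a standard Colab runtime with TensorFlow 2.x and pandas available.
import tensorflow as tf
import pandas as pd

print("TensorFlow version:", tf.__version__)
print("pandas version:", pd.__version__)

# List any GPU devices the runtime exposes; an empty list means the notebook
# is running on CPU and you may want to switch the runtime type to GPU.
gpus = tf.config.list_physical_devices("GPU")
print("GPUs available:", gpus if gpus else "none")
```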
It is now time to discuss the commonalities and differences between academia and industry in how they view AI/ML and its end goals:
- Academics focus on improving results and benchmarks, looking for value in efforts that can ultimately be put to productive human use. They explore every possible domain to uncover areas that AI has not yet touched. Research moves both vertically and horizontally: vertically by improving results, and horizontally by expanding the umbrella to new domains, be it medicine, algorithms, etc. Industry, meanwhile, values profit and features that establish it as an outstanding contender in the market. Product companies want to gain insights, pitch the best offering to customers, and use the feedback loop to make the product better utilised. To sum up, the end goals are drastically different but tied together by the thread of human usage.
- Academics tend to work hands-on in research labs, with AI tools customised and fine-tuned to the demands of experimental work. Companies rely on open-source or paid tools to gather interesting data points that boost the business and surface pain points, e.g. data points used to understand how consumer behaviour correlates with the data and to drive recommender systems.
- The outcomes of academic studies facilitate industrial advancements. Industry does run R&D facilities and publish white papers to highlight the work going on behind closed doors, but its research focus is confined to the vision of the company, e.g. a team of engineers developing new smartphone image-processing algorithms that use AI to stabilise pixels or add filters. In academia, many unknowns are explored, results are refined until they reach the expected quality, and only then is their power revealed. As an example, let's model research papers as a directed graph. The source vertices are the papers that were explored first and lay the foundation of the domain. These vertices point to a new layer of research papers, much like the layers of a neural network (see the toy sketch after this list). These layer-by-layer, research-driven results are what industry uses to build world-class products. You could say academia is the Carnot engine and industry is the Ford!
- What the two have in common is a team that stays focused and agile in spotting patterns that converge into some advantageous utility. These patterns can appear anywhere in the process; a keen observer needs to hold the lens and be ready to peel back the layers one by one.
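To make the directed-graph analogy from the third point a bit more concrete, here is a toy sketch. The paper names and edges are purely illustrative placeholders, and the layering is just a standard topological peel (Kahn-style), not a claim about any real citation data:

```python
# Toy sketch of "research papers as a directed graph": foundational papers are
# source vertices, and each later layer builds on the one before it.
from collections import defaultdict

# An edge (A, B) means "paper B builds on paper A". Titles are placeholders.
edges = [
    ("Perceptron", "Backpropagation"),
    ("Backpropagation", "CNNs"),
    ("Backpropagation", "LSTMs"),
    ("CNNs", "ImageNet-scale models"),
    ("LSTMs", "Attention/Transformers"),
]

graph = defaultdict(list)
indegree = defaultdict(int)
papers = set()
for src, dst in edges:
    graph[src].append(dst)
    indegree[dst] += 1
    papers.update((src, dst))

# Layer 0 holds the foundational (source) papers; each subsequent layer
# depends only on earlier layers, much like layers in a neural network.
layer = [p for p in papers if indegree[p] == 0]
depth = 0
while layer:
    print(f"layer {depth}: {sorted(layer)}")
    next_layer = []
    for paper in layer:
        for follower in graph[paper]:
            indegree[follower] -= 1
            if indegree[follower] == 0:
                next_layer.append(follower)
    layer = next_layer
    depth += 1
```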
These were some of the observations I have made. Feel free to share your own!