WAREHOUSE OPTIMIZATION USING AI @ DHL

Artificial Intelligence is creating waves of disruption across many industries, from manufacturing to human resources (HR). One of the major industries AI has penetrated is supply chain and logistics: large logistics and supply chain organizations have begun their Machine Learning and AI journeys, applying them to a wide range of use cases across the industry. Experts predict that by 2020, AI will have transformed warehouse operations, improving efficiency, profits and target achievement, and making the AI-powered warehouse more responsive and dynamic.

With the above in mind, we began our journey with DHL, where the operations department called us in to help them understand how AI can power their warehouse operations.

We conducted a half-day workshop covering Supply Chain 4.0 and the various types of AI and Machine Learning. We walked the team through algorithms such as neural networks and genetic algorithms, demonstrated machine learning models trained on real DHL warehouse data, and showed how to interpret the results and use them to optimize operations many times over.

The participants and the organizing team found the workshop very useful and directly applicable to their day-to-day operations!

Get in touch with us for such customized trainings and to embark on the data science and AI journey!

PYTHON VS R – THE BURNING QUESTION

R and Python are both open-source programming languages with large communities, and both are very popular among data analysts. New libraries and tools are added continuously to their respective catalogs. R is mainly used for statistical analysis, while Python takes a more general approach to data science.

While Python is often praised for being a general-purpose language with an easy-to-understand syntax, R’s functionality is developed with statisticians in mind, giving it field-specific advantages such as excellent features for data visualization. Both R and Python are state-of-the-art programming languages for data science, so learning both is, of course, the ideal solution. But each requires a significant time investment, a luxury not everyone has.

Let us see how these two programming languages relate to each other by exploring the strengths of R over Python and vice versa, and by making a basic comparison between the two.

Python can do almost all the tasks that R can: data wrangling, data engineering, feature selection, web scraping and so on. But Python is best known as a tool for deploying and implementing machine learning at scale, as Python code is easier to maintain and more robust than R. The language is kept up to date with many data science and machine learning libraries and provides APIs for machine learning and AI. Python is also usually the first choice when the results of an analysis need to be used in an application or a website.
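
As a rough illustration of why Python is often chosen for deployment, the sketch below trains a small scikit-learn model and exposes it over HTTP with Flask. The dataset, route name and port are placeholders chosen for the example, not part of any particular production setup.

# Minimal sketch: train a scikit-learn model and serve it over HTTP with Flask.
# The iris dataset and the /predict route are illustrative placeholders.
from flask import Flask, jsonify, request
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# Train a small model once at startup.
iris = load_iris()
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(iris.data, iris.target)

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body like {"features": [5.1, 3.5, 1.4, 0.2]}
    features = request.get_json()["features"]
    label = int(model.predict([features])[0])
    return jsonify({"class": str(iris.target_names[label])})

if __name__ == "__main__":
    app.run(port=5000)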

R has been developed by academics and statisticians over more than two decades and now offers one of the richest ecosystems for data analysis. Around 12,000 packages are available on CRAN, its open-source repository. A rich variety of libraries can be found for almost any analysis one needs to perform, making R the first choice for statistical analysis, especially for specialized analytical work.

One major difference between R and other statistical tools or languages is how results are communicated. Most other tools make it easy to present findings; in R, RStudio's knitr library helps with this, but beyond that R lacks flexibility for presentation.

R and Python Comparison

Parameter | R | Python
Objective | Data analysis and statistics | Deployment and production
Primary users | Scholars and R&D | Programmers and developers
Flexibility | Easy to use the available libraries | Easy to construct new models from scratch, e.g. matrix computation and optimization
Learning curve | Difficult at the beginning | Linear and smooth
Popularity (percentage change) | 4.23% in 2018 | 21.69% in 2018
Average salary | $99,000 | $100,000
Integration | Runs locally | Well integrated with applications
Task | Easy to get primary results | Good for deploying algorithms
Database size | Handles huge sizes | Handles huge sizes
IDE | RStudio | Spyder, IPython Notebook
Important packages and libraries | tidyverse, ggplot2, caret, zoo | pandas, SciPy, scikit-learn, TensorFlow
Disadvantages | Slow; high learning curve; dependencies between libraries | Not as many libraries as R

Advantages of R:
  • Graphs are made to talk, and R makes them beautiful
  • Large catalog for data analysis
  • GitHub interface
  • RMarkdown
  • Shiny

Advantages of Python:
  • Jupyter notebooks make it easy to share work with colleagues
  • Mathematical computation
  • Deployment
  • Code readability
  • Speed
  • Functions

Source: https://www.guru99.com/r-vs-python.html

The Usage

As mentioned before, Python has powerful libraries for math, statistics and Artificial Intelligence. But while Python is the best tool for Machine Learning integration and deployment, the same cannot be said for business analytics.

R, on the other hand, was designed by experts to answer statistical questions, though it can also handle machine learning and data science problems. R is preferred for data science work thanks to its powerful communication libraries, and it comes with numerous packages for time series analysis, panel data and data mining. But R is known to have a steep learning curve and is therefore not recommended for beginners.

For a beginner in data science with the necessary statistical knowledge, it might be easier to start with Python, learn how to build a model from scratch, and then switch to the functions in the machine learning libraries. R can be the first choice if the focus is going to be on statistics.
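
For instance, the "build it from scratch, then switch to the libraries" path might look like the sketch below, which fits the same simple linear regression first with plain NumPy and then with scikit-learn; the toy data is invented purely for illustration.

# Sketch: fit the same simple linear regression by hand and with scikit-learn.
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data: y is roughly 2x + 1 with a little noise.
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=50)
y = 2 * x + 1 + rng.normal(0, 0.5, size=50)

# 1) From scratch: closed-form least squares via the normal equations.
X = np.column_stack([np.ones_like(x), x])          # add an intercept column
beta = np.linalg.solve(X.T @ X, X.T @ y)           # [intercept, slope]
print("by hand:      intercept=%.3f slope=%.3f" % (beta[0], beta[1]))

# 2) With a library: the same model in two lines of scikit-learn.
model = LinearRegression().fit(x.reshape(-1, 1), y)
print("scikit-learn: intercept=%.3f slope=%.3f" % (model.intercept_, model.coef_[0]))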

In conclusion, one needs to pick the programming language based on the requirements and available resources. The decision should be made based on what kind of problem is to be solved, or the kind of tools that are available in the field.

CAN AI SAVE MARINE LIFE?

In June 2018, a rescue attempt to save a pilot whale who was struggling to swim and breathe failed in Thailand. The autopsy revealed that the whale had more than 17 pounds of plastic bags and other trash clogging its stomach, which had made it difficult for the animal to digest food. Such examples have become fairly common nowadays.

According to UNESCO’s Facts and figures on marine biodiversity, by 2100, without major changes, more than half of all marine species will be at risk of extinction. Oceans contain the greatest diversity of life on Earth and protecting them has become one of the world’s most important challenges. One of the latest powers to take charge in this field is Artificial Intelligence.

How does Artificial Intelligence play a role?

AI can be, and is being, used to complete tasks usually done manually by researchers, from identifying individual animals in photos for population studies to categorizing the millions of camera-trap photos gathered by field scientists. The resulting flood of data can then be analyzed for many purposes: new advances in satellite observation, open data and machine learning now allow us to process the massive amounts of data being produced. Here are some examples where smart technology is making a difference.
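
To give a concrete flavour of how such photo pipelines work, the sketch below runs a pretrained ImageNet classifier over a folder of camera images and prints the top label for each. The folder path is a placeholder, TensorFlow 2.x is assumed, and a real wildlife project would fine-tune a model on its own labelled species rather than rely on generic ImageNet classes.

# Sketch: label a folder of camera images with a pretrained ImageNet model.
# "photos/" is a placeholder path; real projects fine-tune on their own species labels.
from pathlib import Path
import numpy as np
import tensorflow as tf
from tensorflow.keras.applications.mobilenet_v2 import (
    MobileNetV2, decode_predictions, preprocess_input)

model = MobileNetV2(weights="imagenet")

for path in sorted(Path("photos").glob("*.jpg")):
    img = tf.keras.utils.load_img(path, target_size=(224, 224))
    batch = preprocess_input(np.expand_dims(tf.keras.utils.img_to_array(img), axis=0))
    _, label, score = decode_predictions(model.predict(batch), top=1)[0][0]
    print(f"{path.name}: {label} ({score:.2f})")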

In 2014, researchers at the University of California, San Diego (UCSD) built an artificially intelligent camera intended to monitor and track endangered species. The submersible system records video when it hears a vocalization from a marine animal and produces relevant data for biological monitoring. The SphereCam is powered by an Intel Edison compute module, which allows for flexibility, energy efficiency and easy programming. The module can run on batteries for up to a week and fits inside the camera housing, keeping it dry.

Global Fishing Watch uses satellite-based monitoring to track fishing vessels in real time and help protect fisheries around the world. It collects satellite imagery and analyzes boats’ movements with a specially designed form of machine learning to determine whether they are fishing vessels or simply seafaring ones. All fishing-boat data is then posted to its open website, so that researchers, law enforcement agencies and the public can watch key trends, such as fishing frequency, and monitor whether any fishing boats venture into protected waters.
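
A heavily simplified version of that idea is sketched below: summarise each vessel track into a few movement features and train a classifier to separate fishing from transiting behaviour. The CSV file and feature names are hypothetical; Global Fishing Watch's actual models are far more sophisticated, and this only illustrates the general approach.

# Sketch: classify vessel tracks as fishing vs. transit from movement features.
# The CSV name and feature columns are hypothetical; real AIS pipelines are much richer.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

tracks = pd.read_csv("vessel_tracks.csv")  # one row per track segment
features = ["mean_speed_knots", "speed_variance", "mean_turn_rate", "distance_from_shore_km"]
X_train, X_test, y_train, y_test = train_test_split(
    tracks[features], tracks["is_fishing"], test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))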

Google has teamed up with Queensland University to create a detector powered by machine learning, which can automatically identify manatees (or sea cows) in ocean images. This detector can save the time, energy and resources that researchers spend while sifting through thousands of aerial photos to spot the animals, as the image recognition system will do this work for them.

How do all these devices and information help?

Information such as this helps conservationists track populations, identify the results of human interventions in manatee habitats and can play a key role in managing the future of this endangered species. Similar software can also be developed to track other marine life as well, such as humpback whales and other ocean mammals.

Researchers are also using computational sustainability — the ability to analyze Big Data and make predictions based on trends — to understand marine life and find solutions.

In partnership with Microsoft, The Nature Conservancy has combined traditional academic research with cloud and AI technologies to map ocean wealth in high resolution. By evaluating the economic value of ocean ecosystem services, such as carbon storage and fishing activity, the project makes better conservation and planning decisions possible.

IBM recently announced a new AI-powered microscope capable of observing plankton behavior in order to predict the health of the environment. Within a few years, IBM anticipates that these small autonomous AI microscopes will be networked in the cloud and deployed around the world, continually collecting information that provides tremendous insight into the health and functioning of the ocean’s complex ecosystem.

Combining AI and robotic technology could also reduce illegal activity by poachers through tracking as well as reduce emissions from cargo ships by suggesting the best routes. “Not only will a healthy ocean benefit its inhabitants, but the entire human race. AI is thus the next step”, writes Megan Ray from the Women of Green.

While AI is disrupting numerous industries and transforming lifestyle, it could also be a smart way to save marine life.

AI POWERED WAREHOUSE OPTIMIZATION

Artificial Intelligence is creating waves of disruption across many industries, from manufacturing to human resources (HR). One of the major industries AI has penetrated is supply chain and logistics. Experts predict that by 2020, AI will have transformed warehouse operations, improving efficiency, profits and target achievement, and making the AI-powered warehouse more responsive and dynamic.

How can AI help in Warehouse Optimization?

One way AI can optimize a warehouse is by increasing the productivity of its workforce, especially in warehouses that deal with regular pick-and-pack operations. Another is to use AI to enhance communication between different operational departments, which in turn ensures that day-to-day tasks run smoothly. For example, online supermarket Ocado uses robots that communicate with each other within a very short span of time, eliminating various human inaccuracies.

This would help in achieving overall targets and ensuring that the tasks are completed, while using time efficiently.
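
One small, concrete example of this kind of optimization is sequencing pick locations so that workers or robots travel less. The sketch below orders a pick list with a simple nearest-neighbour heuristic on made-up aisle coordinates; it illustrates the idea only and is not DHL's or Ocado's actual routing logic.

# Sketch: order a pick list with a nearest-neighbour heuristic to shorten walking distance.
# Coordinates are made-up (aisle, bay) positions; real systems use warehouse layout graphs.
import math

picks = {"A1": (0, 2), "B7": (3, 8), "C2": (5, 1), "D5": (7, 6), "E3": (9, 3)}

def route(start=(0, 0)):
    remaining, here, order = dict(picks), start, []
    while remaining:
        # Always walk to the closest remaining pick location.
        nxt = min(remaining, key=lambda loc: math.dist(here, remaining[loc]))
        order.append(nxt)
        here = remaining.pop(nxt)
    return order

print("pick sequence:", " -> ".join(route()))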

Multiple operations in the supply chain industry are expected to become fully automated by 2030. Predictable physical activities can easily be handled by smart machines, saving the time and money usually spent on wages, human mistakes and lunch breaks, among other things. Robots such as Amazon’s Kiva robots can pick up goods and distribute them to different stations within a warehouse in mere minutes, and need only five minutes of charging every hour.

Although 30% of jobs have the potential to become automated, employees are not expected to be fully replaced by robots. Automation will be integrated into current operations as an aid, something that works alongside workers and helps with routine tasks.

How is AI useful in data processing and mining?

Another area AI can efficiently take over is collecting and processing the data generated by different warehouse operations. Complex operations can be captured and used to recognize patterns, regularities and interdependencies in unstructured data. A smart warehouse can then adapt, dynamically and independently, to new situations within the entire logistics system, and the data collected can be analyzed to arrive at better, improved business strategies that use AI to their advantage.
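
As a simple illustration of mining such operational data, the sketch below flags unusually slow or odd order-handling records with an Isolation Forest. The file name and column names are hypothetical stand-ins for whatever a warehouse management system actually exports.

# Sketch: flag anomalous order-handling records with an Isolation Forest.
# "warehouse_ops.csv" and its columns are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import IsolationForest

ops = pd.read_csv("warehouse_ops.csv")  # e.g. one row per completed order
features = ops[["pick_seconds", "pack_seconds", "items_in_order", "travel_metres"]]

detector = IsolationForest(contamination=0.02, random_state=0).fit(features)
ops["anomaly"] = detector.predict(features) == -1   # True where behaviour looks unusual

print(ops[ops["anomaly"]].head())  # review the flagged orders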

To conclude

Machine learning algorithms and AI can be implemented in warehouse operations and the supply chain so that they can anticipate situations and solve problems efficiently, allowing decisions to be made in a short time.

AI can use the real-time insights gathered at every touch point in the warehouse’s workflow to improve inventory accuracy and increase inventory turns. Warehouse activities can therefore be actively monitored, while the system anticipates the workflow and proactively recommends optimizations.

AI IN 2019 – EXCITING TIMES AHEAD

Industry 4.0, powered by artificial intelligence and machine learning, goes beyond anything dreamed up during previous technological revolutions. The future promises a world where machines not only do the physical labor but are also in charge of the thinking: planning, strategizing and making decisions. Artificial intelligence has brought such advancements to industry that the furor surrounding it continues into 2019.

So what does 2019 have in store?

The year 2019 will probably see increased emphasis on measures designed to make AI more transparent. For example, IBM recently added technology designed to improve the traceability of decisions to its AI OpenScale platform. Bernard Marr writes in Forbes that “This concept gives real-time insights into not only what decisions are being made, but how they are being made, drawing connections between data that is used, decision weighting and potential for bias in information.”

Research and business will also benefit from openness that exposes bias in data or algorithms. This would go a long way towards solving the ‘black box’ problem, where AI-powered decisions seem unfathomable without a thorough understanding of how they are actually made, and would help AI gain acceptance in wider society.
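
In practice, even simple tooling helps with this kind of transparency. The sketch below trains a classifier and reports permutation importances, i.e. how much each input actually drives the model's decisions, which is one common way to open the black box. It is a generic scikit-learn example on a public dataset, not IBM's OpenScale.

# Sketch: inspect which inputs drive a model's decisions via permutation importance.
# Generic scikit-learn example (breast-cancer dataset), unrelated to any vendor tool.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)

# Print the five features whose shuffling hurts accuracy the most.
for i in result.importances_mean.argsort()[::-1][:5]:
    print(f"{data.feature_names[i]:<25} importance={result.importances_mean[i]:.3f}")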

The next step would be a deeper infiltration of AI and automation into every business. In 2018, companies began to get a firmer grip on the realities of what AI can and cannot do. Retailers are proficient at gathering data through receipts and loyalty programs and feeding it into AI engines to work out how to sell us things more effectively, while manufacturers use predictive technology to know when a repair is required or when a machine will wear out. In 2019 this technology will become more trustworthy, revamped with the lessons learned from its initial deployments.

Business Line says that by 2019, at least 25 percent of employees at all large corporations will communicate with a bot for information. More than half of organizations have already invested in virtual customer assistants (VCAs) for customer service, as they realize the advantages of automated self-service and the ability to escalate to a human in complex situations.

An AI assistant can also semantically understand the job descriptions it is fed and find relevant matches from available job portals and databases. A hiring assistant can then reach out to identified candidates and engage them in a chat to pre-qualify them against company requirements.
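
A bare-bones version of that matching step might look like the sketch below, which ranks candidate profiles against a job description by TF-IDF cosine similarity. The job text and profiles are invented, and real hiring assistants use much richer semantic models.

# Sketch: rank candidate profiles against a job description with TF-IDF similarity.
# The job text and profiles are invented; production systems use stronger semantic models.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

job = "Data engineer with Python, SQL and experience building ETL pipelines on AWS"
candidates = [
    "Backend developer, strong Python and SQL, built data pipelines on AWS Glue",
    "Marketing manager with social media and campaign analytics experience",
    "Data analyst comfortable with SQL, dashboards and some Python scripting",
]

# Vectorize the job description together with the candidate texts.
vectors = TfidfVectorizer(stop_words="english").fit_transform([job] + candidates)
scores = cosine_similarity(vectors[0], vectors[1:]).ravel()

# Print candidates from best to worst match.
for rank, idx in enumerate(scores.argsort()[::-1], start=1):
    print(f"{rank}. score={scores[idx]:.2f}  {candidates[idx]}")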

AI is out there, ready to be consumed by startups and corporations alike, to solve almost any problem from commuting to visualization, replacing many mundane human tasks with efficient machines and leaving humans to make the more complex decisions. O’Reilly data shows that 51 percent of surveyed organizations already use data science teams to develop AI solutions for internal purposes. Adoption of AI tools will be one of the most important AI trends in 2019. Let us wait and watch how the year rolls out.
