University of Cambridge | automation (/taxonomy/subjects/automation)

Robot 'chef' learns to recreate recipes from watching food videos
/research/news/robot-chef-learns-to-recreate-recipes-from-watching-food-videos | 5 June 2023

Researchers have trained a robotic 'chef' to watch and learn from cooking videos, and recreate the dish itself.

[Image: robot arm reaching for a piece of broccoli]

The researchers, from the University of Cambridge, programmed their robotic chef with a 'cookbook' of eight simple salad recipes. After watching a video of a human demonstrating one of the recipes, the robot was able to identify which recipe was being prepared and make it.

The videos also helped the robot incrementally add to its cookbook: by the end of the experiment, the robot had come up with a ninth recipe on its own. The results, reported in the journal IEEE Access, demonstrate how video content can be a valuable and rich source of data for automated food production, and could enable easier and cheaper deployment of robot chefs.

Robotic chefs have been featured in science fiction for decades, but in reality cooking is a challenging problem for a robot. Several commercial companies have built prototype robot chefs, although none is currently commercially available, and they lag well behind their human counterparts in terms of skill.

Human cooks can learn new recipes through observation, whether that's watching another person cook or watching a video on YouTube, but programming a robot to make a range of dishes is costly and time-consuming.

"We wanted to see whether we could train a robot chef to learn in the same incremental way that humans can, by identifying the ingredients and how they go together in the dish," said Grzegorz Sochacki from Cambridge's Department of Engineering, the paper's first author.

Sochacki, a PhD candidate in Professor Fumiya Iida's Bio-Inspired Robotics Laboratory (https://birlab.org/), and his colleagues devised eight simple salad recipes and filmed themselves making them. They then used a publicly available neural network to train their robot chef. The neural network had already been trained to identify a range of different objects, including the fruits and vegetables used in the eight salad recipes (broccoli, carrot, apple, banana and orange).

Using computer vision techniques, the robot analysed each frame of video and was able to identify the different objects and features, such as a knife and the ingredients, as well as the human demonstrator's arms, hands and face. Both the recipes and the videos were converted to vectors, and the robot performed mathematical operations on the vectors to determine the similarity between a demonstration and each recipe in its cookbook.
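The paper's full pipeline is more involved, but the matching step it describes can be sketched in a few lines. The sketch below is illustrative only, not the authors' code: the vocabulary, the threshold and the function names are all assumptions. It shows how counting detected ingredients and actions, then comparing count vectors by cosine similarity, lets one cookbook entry cover scaled-up portions of the same dish.

```python
import numpy as np

# Hypothetical vocabulary of detectable ingredients and actions.
VOCAB = ["broccoli", "carrot", "apple", "banana", "orange", "chop", "mix"]

def to_vector(counts):
    """Map {token: count} detections onto a fixed vocabulary vector."""
    return np.array([counts.get(tok, 0) for tok in VOCAB], dtype=float)

def cosine(a, b):
    """Cosine similarity: direction matters, overall quantity does not."""
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

def identify(demo_counts, cookbook, new_recipe_threshold=0.9):
    """Match a watched demonstration against the cookbook, or learn it as new."""
    demo = to_vector(demo_counts)
    scores = {name: cosine(demo, to_vector(r)) for name, r in cookbook.items()}
    best = max(scores, key=scores.get)
    if scores[best] < new_recipe_threshold:
        # Nothing close enough: add the demonstration as a new recipe.
        cookbook[f"recipe_{len(cookbook) + 1}"] = dict(demo_counts)
        return None
    return best

cookbook = {"apple_carrot_salad": {"apple": 2, "carrot": 2, "chop": 4}}
# A larger portion points in the same direction, so it matches the same entry.
print(identify({"apple": 3, "carrot": 3, "chop": 6}, cookbook))  # apple_carrot_salad
```

Because cosine similarity compares the direction of the count vectors rather than their length, a scaled-up portion of a known dish still matches the same cookbook entry, which is consistent with the behaviour reported below.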
By correctly identifying the ingredients and the actions of the human chef, the robot could determine which of the recipes was being prepared. It could infer, for example, that if the human demonstrator was holding a knife in one hand and a carrot in the other, the carrot would then be chopped.

Of the 16 videos it watched, the robot recognised the correct recipe 93% of the time, even though it detected only 83% of the human chef's actions. It was also able to recognise that slight variations in a recipe, such as a double portion or ordinary human error, were variations and not a new recipe. The robot also correctly recognised the demonstration of a new, ninth salad, added it to its cookbook and made it.

"It's amazing how much nuance the robot was able to detect," said Sochacki. "These recipes aren't complex: they're essentially chopped fruits and vegetables, but it was really effective at recognising, for example, that two chopped apples and two chopped carrots is the same recipe as three chopped apples and three chopped carrots."

The videos used to train the robot chef are not like the food videos made by some social media influencers, which are full of fast cuts and visual effects and quickly move back and forth between the person preparing the food and the dish they're preparing. The robot would struggle to identify a carrot, for example, if the human demonstrator had their hand wrapped around it; the demonstrator had to hold the carrot up so that the robot could see the whole vegetable.

"Our robot isn't interested in the sorts of food videos that go viral on social media: they're simply too hard to follow," said Sochacki. "But as these robot chefs get better and faster at identifying ingredients in food videos, they might be able to use sites like YouTube to learn a whole range of recipes."

The research was supported in part by Beko plc and the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation (UKRI).

Reference: Grzegorz Sochacki et al. 'Recognition of Human Chef's Intentions for Incremental Learning of Cookbook by Robotic Salad Chef.' IEEE Access (2023).
DOI: 10.1109/ACCESS.2023.3276234 (https://ieeexplore.ieee.org/document/10124218)

Video: Robot 'chef' learns to recreate recipes from watching food videos (https://www.youtube-nocookie.com/embed/nx3k4XA3x4Q)
Taste of the future: robot chef learns to 'taste as you go'
/research/news/taste-of-the-future-robot-chef-learns-to-taste-as-you-go | 4 May 2022

A robot 'chef' has been trained to taste food at different stages of the chewing process to assess whether it's sufficiently seasoned.

Working in collaboration with domestic appliances manufacturer Beko, researchers from the University of Cambridge trained their robot chef to assess the saltiness of a dish at different stages of the chewing process, imitating a similar process in humans.

Their results could be useful in the development of automated or semi-automated food preparation, by helping robots learn what tastes good and what doesn't, making them better cooks.

When we chew our food, we notice a change in texture and taste. For example, biting into a fresh tomato at the height of summer will release juices, and as we chew, releasing both saliva and digestive enzymes, our perception of the tomato's flavour will change.

The robot chef, which had already been trained to make omelettes based on human tasters' feedback, tasted nine different variations of a simple dish of scrambled eggs and tomatoes at three different stages of the chewing process, and produced 'taste maps' of the different dishes.

The researchers found that this 'taste as you go' approach significantly improved the robot's ability to quickly and accurately assess the saltiness of the dish over other electronic tasting technologies, which test only a single homogenised sample. The results are reported in the journal Frontiers in Robotics & AI.

The perception of taste is a complex process in humans that has evolved over millions of years: the appearance, smell, texture and temperature of food all affect how we perceive taste; the saliva produced during chewing helps carry chemical compounds in food to taste receptors, mostly on the tongue; and the signals from those receptors are passed to the brain. Once our brains are aware of the flavour, we decide whether we enjoy the food or not.

Taste is also highly individual: some people love spicy food, while others have a sweet tooth.
A good cook, whether amateur or professional, relies on their sense of taste, and can balance the various flavours within a dish to make a well-rounded final product.

"Most home cooks will be familiar with the concept of tasting as you go: checking a dish throughout the cooking process to see whether the balance of flavours is right," said Grzegorz Sochacki from Cambridge's Department of Engineering, the paper's first author. "If robots are to be used for certain aspects of food preparation, it's important that they are able to 'taste' what they're cooking."

"When we taste, the process of chewing also provides continuous feedback to our brains," said co-author Dr Arsen Abdulali, also from the Department of Engineering. "Current methods of electronic testing only take a single snapshot from a homogenised sample, so we wanted to replicate a more realistic process of chewing and tasting in a robotic system, which should result in a tastier end product."

The researchers are members of Cambridge's Bio-Inspired Robotics Laboratory (https://birlab.org/), run by Professor Fumiya Iida of the Department of Engineering, which focuses on training robots to carry out the so-called 'last metre' problems which humans find easy but robots find difficult. Cooking is one of these tasks: earlier tests with their robot 'chef' have produced a passable omelette using feedback from human tasters.

"We needed something cheap, small and fast to add to our robot so it could do the tasting: it needed to be cheap enough to use in a kitchen, small enough for a robot, and fast enough to use while cooking," said Sochacki.

To imitate the human process of chewing and tasting in their robot chef, the researchers attached a conductance probe, which acts as a salinity sensor, to a robot arm. They prepared scrambled eggs and tomatoes, varying the number of tomatoes and the amount of salt in each dish.

Using the probe, the robot 'tasted' the dishes in a grid-like fashion, returning a reading in just a few seconds.

To imitate the change in texture caused by chewing, the team then put the egg mixture in a blender and had the robot test the dish again. The different readings at different points of 'chewing' produced taste maps of each dish.
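To make the idea concrete, here is a minimal sketch of how grid readings taken at several 'chewing' stages could be combined into a feature for assessing seasoning. It is an illustration only, not the study's processing pipeline: the grid size, stage labels, probe interface and nearest-neighbour classifier are all assumptions.

```python
import numpy as np

STAGES = ("unmixed", "part_chewed", "blended")  # hypothetical stage labels

def taste_map(read_probe, rows=5, cols=5):
    """'Taste' the dish in a grid-like fashion, one probe reading per cell."""
    return np.array([[read_probe(i, j) for j in range(cols)] for i in range(rows)])

def taste_features(read_probe_at_stage):
    """Stack one taste map per chewing stage into a single feature vector."""
    maps = [taste_map(lambda i, j, s=s: read_probe_at_stage(s, i, j)) for s in STAGES]
    return np.concatenate([m.ravel() for m in maps])

def classify_saltiness(features, references):
    """Nearest-neighbour match against reference dishes of known seasoning."""
    return min(references, key=lambda name: np.linalg.norm(features - references[name]))
```

The point of the stacked maps is that a dish is described by many readings taken as its texture changes, rather than by the single number an electronic tongue returns for one homogenised sample.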
Their results showed a significant improvement in the ability of robots to assess saltiness over other electronic tasting methods, which are often time-consuming and only provide a single reading.

While their technique is a proof of concept, the researchers say that by imitating the human processes of chewing and tasting, robots will eventually be able to produce food that humans will enjoy, and that could be tweaked according to individual tastes.

"When a robot is learning how to cook, like any other cook, it needs indications of how well it did," said Abdulali. "We want the robots to understand the concept of taste, which will make them better cooks. In our experiment, the robot can 'see' the difference in the food as it's chewed, which improves its ability to taste."

"Beko has a vision to bring robots to the home environment which are safe and easy to use," said Dr Muhammad W. Chughtai, Senior Scientist at Beko plc. "We believe that the development of robotic chefs will play a major role in busy households and assisted living homes in the future. This result is a leap forward in robotic cooking, and by using machine and deep learning algorithms, mastication will help robot chefs adjust taste for different dishes and users."

In future, the researchers are looking to improve the robot chef so it can taste different types of food, and to improve its sensing capabilities so it can taste sweet or oily food, for example.

The research was supported in part by Beko plc and the Engineering and Physical Sciences Research Council (EPSRC) Centre for Doctoral Training in Agri-Food Robotics (Agriforwards CDT). EPSRC is part of UK Research and Innovation (UKRI). Fumiya Iida is a Fellow of Corpus Christi College, Cambridge.

Reference: Grzegorz Sochacki, Arsen Abdulali and Fumiya Iida. 'Mastication-Enhanced Taste-Based Classification of Multi-Ingredient Dishes for Robotic Cooking.' Frontiers in Robotics & AI (2022). DOI: 10.3389/frobt.2022.886074 (https://www.frontiersin.org/articles/10.3389/frobt.2022.886074/abstract)

Video: Taste of the future: robot chef learns to 'taste as you go' (https://www.youtube-nocookie.com/embed/nZ0xviqzUJg)
Machine learning to help develop self-healing robots that 'feel pain'
/research/news/machine-learning-to-help-develop-self-healing-robots-that-feel-pain | 7 August 2019

Researchers from the University of Cambridge will use self-healing materials and machine learning to develop soft robotics as part of a new collaborative project.

[Image: robotic hand made of self-healing material that can heal at room temperature. Credit: Bram Vanderborght]

The goal of the €3 million Self-healing soft robot (SHERO) project, funded by the European Commission, is to create a next-generation robot made from self-healing materials (flexible plastics) that can detect damage, take the necessary steps to temporarily heal itself, and then resume its work, all without the need for human interaction.

Led by the Vrije Universiteit Brussel (VUB), the research consortium includes the Department of Engineering (University of Cambridge), École Supérieure de Physique et de Chimie Industrielles de la ville de Paris (ESPCI), the Swiss Federal Laboratories for Materials Science and Technology (Empa), and the Dutch polymer manufacturer SupraPolix.

As part of the SHERO project, the Cambridge team, led by Dr Fumiya Iida of the Department of Engineering, is looking at integrating self-healing materials into soft robotic arms.

Dr Thomas George Thuruthel, also from the Department of Engineering, said self-healing materials could have future applications in modular robotics, educational robotics and evolutionary robotics, where a single robot can be 'recycled' to generate a fresh prototype.

"We will be using machine learning to work on the modelling and integration of these self-healing materials, to include self-healing actuators and sensors, damage detection, localisation and controlled healing," he said. "The adaptation of models after the loss of sensory data and during the healing process is another area we are looking to address. The end goal is to integrate the self-healing sensors and actuators into demonstration platforms in order to perform specific tasks."
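The announcement does not specify the algorithms involved, but one minimal, common pattern for the damage-detection part is to learn the healthy mapping from actuation commands to sensor readings and flag large deviations from it. The sketch below illustrates only that generic idea, with hypothetical class and parameter names; it is not the SHERO implementation.

```python
import numpy as np
from sklearn.linear_model import Ridge

class DamageDetector:
    """Flag and localise damage as deviation from a learned healthy response."""

    def __init__(self, threshold=0.1):
        self.model = Ridge(alpha=1.0)  # maps actuation commands -> sensor readings
        self.threshold = threshold

    def fit_healthy(self, commands, readings):
        """Learn the sensor response of the undamaged robot.

        commands: (n_samples, n_actuators); readings: (n_samples, n_sensors).
        """
        self.model.fit(commands, readings)

    def check(self, command, reading):
        """Return (damage suspected?, index of the most deviant sensor)."""
        predicted = self.model.predict(command.reshape(1, -1))[0]
        residual = np.abs(reading - predicted)
        return bool(residual.max() > self.threshold), int(residual.argmax())
```

Localisation here is simply the sensor with the largest residual; a real system would need richer models, and would also have to re-fit them as sensors are lost and healed, which is exactly the adaptation problem Thuruthel describes.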
Professor Bram Vanderborght of VUB, who is leading the project with scientists from the robotics research centre Brubotics and the polymer research lab FYSC, said: "We are obviously very pleased to be working on the next generation of robots. Over the past few years, we have already taken the first steps in creating self-healing materials for robots. With this research we want to continue and, above all, ensure that robots that are used in our working environment are safer, but also more sustainable. Due to the self-repair mechanism of this new kind of robot, complex, costly repairs may be a thing of the past."

Video: Self-healing robots that 'feel pain' (https://www.youtube-nocookie.com/embed/R7fZbYUFtc8)

One day of paid work a week is all we need to get mental health benefits of employment
/stories/employment-dosage | 19 June 2019

Latest research finds that up to eight hours of paid work a week significantly boosts mental health and life satisfaction.
However, researchers found little evidence that any more hours, including a full five-day week, provide further increases in wellbeing.

The £2 billion vegetable and the agricultural future of the East
/stories/the-two-billion-vegetable | 15 March 2019

From crop science to robotics, supply chains to economics, Cambridge University researchers are working with farmers and industry to sustainably increase agricultural productivity and profitability.

Humans need not apply
/research/features/humans-need-not-apply | 5 July 2018

Will automation, AI and robotics mean a jobless future, or will their productivity free us to innovate and explore? Is the impact of new technologies to be feared, or a chance to rethink the structure of our working lives and ensure a fairer future for all?

On googling 'will a robot take my job?' I find myself on a BBC webpage that invites me to discover the likelihood that my work will be automated in the next 20 years. I type in 'editor'. "It's quite unlikely, 8%" comes back. Quite reassuring; but, coming from a farming family, it's a sobering moment when I type in 'farmer': "It's fairly likely, 76%".

The results may well be out of date (such is the swiftness of change in labour market predictions), but the fact that the webpage even exists says something about the focus of many of today's conversations around the future of work.

Many of the discussions are driven by stark numbers. According to a scenario suggested recently by the consultancy McKinsey, between 75 and 375 million workers (3 to 14% of the global workforce) will need to switch occupational categories by 2030, and all workers will need to adapt "as their occupations evolve alongside increasingly capable machines".

Just recently, the online retailer Shop Direct announced the closure of warehouses and a move to automation, putting nearly 2,000 jobs at risk. Automation, or 'embodied' artificial intelligence (AI), is one aspect of the disruptive effects of technology on the labour market. 'Disembodied AI', like the algorithms running in our smartphones, is another.

Dr Stella Pachidi from Cambridge Judge Business School believes that some of the most fundamental changes in work are happening as a result of the 'algorithmication' of jobs that are dependent on information rather than production: the so-called knowledge economy.

Algorithms are capable of learning from data to undertake tasks that previously needed human judgement, such as reading legal contracts, analysing medical scans and gathering market intelligence.

"In many cases, they can outperform humans," says Pachidi.
"Organisations are attracted to using algorithms because they want to make choices based on what they consider is 'perfect information', as well as to reduce costs and enhance productivity."

But these enhancements are not without consequences, says Pachidi, who has recently started to look at the impact of AI on the legal profession.

"If routine cognitive tasks are taken over by AI, how do professions develop their future experts?" she asks. "Expertise and the authority it gives you is distributed in the workplace. One way of learning about a job is 'legitimate peripheral participation': a novice stands next to experts and learns by observation. If this isn't happening, then you need to find new ways to learn."

Another issue is the extent to which the technology influences or even controls the workforce. For over two years, Pachidi was embedded in a telecommunications company. There she observed 'small battles' playing out that could have vast consequences for the future of the company.

"The way telecoms salespeople work is through personal and frequent contact with clients, using the benefit of experience to assess a situation and reach a decision. However, the company had started using a data analytics algorithm that defined when account managers should contact certain customers, about which kinds of campaigns, and what to offer them."

The algorithm, usually built by external designers, often becomes the curator of knowledge, she explains. "In cases like this, a myopic view begins to creep into working practices whereby workers learn through the 'algorithm's eyes' and become dependent on its instructions. Alternative explorations, the so-called technology of foolishness where innovation comes out of experimentation and intuition, are effectively discouraged."

Pachidi and colleagues have even observed the development of strategies to 'game' the algorithm. "Decisions made by algorithms can structure and control the work of employees.
We are seeing cases where workers feed the algorithm with false data to reach their targets."

It's scenarios like these that many researchers in Cambridge and beyond are working to avoid by increasing the trustworthiness and transparency of AI technologies (see issue 35 of Research Horizons: /system/files/issue_35_research_horizons_new.pdf), so that organisations and individuals understand how AI decisions are made.

In the meantime, says Pachidi, in our race to reap the undoubted benefits of new technology, it's important to avoid taking a laissez-faire approach to algorithmication: "We need to make sure we fully understand the dilemmas that this new world raises regarding expertise, occupational boundaries and control."

While Pachidi sees changes ahead in the nature of work, the economist Professor Hamish Low believes that the future of work will involve major transitions across the whole life course for everyone: "The traditional trajectory of full-time education followed by full-time work followed by a pensioned retirement is a thing of the past."

"Disruptive technologies, the rise of the ad hoc 'gig economy', living longer and the fragile economics of pension provision will mean a multistage employment life: one where retraining happens across the life course, and where multiple jobs and no job happen by choice at different stages."

His research examines the role of risk and the welfare system in relation to work at these various life stages. "When we are talking about the future of work," he says, "we should have in mind these new frameworks for what people's lives will look like, and prepare new generations for a different perspective on employment."

On the subject of future job loss, he believes the rhetoric is based on a fallacy: "It assumes that the number of jobs is fixed. If in 30 years half of 100 jobs are being carried out by robots, that doesn't mean we are left with just 50 jobs for humans. The number of jobs will increase: we would expect there to be 150 jobs."

Dr Ewan McGaughey, at Cambridge's Centre for Business Research and King's College London, agrees that 'apocalyptic' views about the future of work are misguided. "It's the laws that restrict the supply of capital to the job market, not the advent of new technologies, that cause unemployment."

His recently published research asks whether automation, AI and robotics will mean a 'jobless future' by looking at the causes of unemployment. "History is clear that change can mean redundancies. After World War II, 42% of UK jobs were redundant, but social policy maintained full employment. Yes, technology can displace people. But social policies can tackle this through retraining and redeployment."

He adds: "The big problem won't be unemployment, it will be underemployment: people who want to work but can't because they have zero-hours contracts. If there is going to be change to jobs as a result of AI and robotics, then I'd like to see governments seizing the opportunity to improve policy to enforce good job security.
We can 'reprogramme' the law to prepare for a fairer future of work and leisure."

This might mean revitalising fiscal and monetary policies, such as universal social security and taxing the owners of robots.

McGaughey's findings are a call to arms to leaders of organisations, governments and banks to pre-empt the coming changes with bold new policies that ensure full employment, fair incomes and a thriving economic democracy.

"The promises of these new technologies are astounding. They deliver humankind the capacity to live in a way that nobody could have once imagined," he adds. "Just as the industrial revolution brought people past subsistence agriculture, and the corporate revolution enabled mass production, a third revolution has been pronounced. But it will not only be one of technology. The next revolution will be social."

Inset: read more about our research on the topic of work in the University's research magazine; download a pdf (/system/files/issue_36_research_horizons.pdf) or view it on Issuu (https://issuu.com/uni_cambridge/docs/issue_36_research_horizons).

Linking research to policy makers

Dr Koen Jonkers is at the Joint Research Centre (https://commission.europa.eu/about/departments-and-executive-agencies/joint-research-centre_en), the European Commission's science and knowledge service in Brussels, and also a policy fellow at Cambridge's Centre for Science and Policy (CSaP, https://www.csap.cam.ac.uk/).

Over the past few months, Jonkers has been discussing the future of work with academic experts in Cambridge as part of his research for a special JRC report aimed at providing evidence for the European Commission's employment and social affairs policies.
"Among the megatrends that will affect the future of work (an ageing workforce, migration, globalisation, urbanisation, and so on), the impact of technology is one where we seem to be witnessing a step change in the relationship that many people have with their work," says Jonkers, who is one of the scientists employed by the JRC to provide independent scientific advice and support to EU policy.

"Some people have said there will be a major shock in terms of joblessness. Others that it is part of an ongoing trend and that it will bring opportunity. We want to give an overview of all the viewpoints, to analyse how well societies are equipped to deal with change, to mitigate potential adverse consequences, and to come up with an idea of what is likely to happen.

"As well as reskilling and upskilling current workers, governments will be keen to look at anticipatory actions to prepare young people to have a different type of working life to that of their parents and grandparents, so that they will be used to a world where people and machines work together."

The mission of CSaP is to improve public policy, in the UK and Europe, through the more effective use of evidence and expertise. "Through the CSaP Fellowship, it's been very refreshing to talk with people with very high levels of expertise in fields other than my own," says Jonkers. "In such a multifaceted area as the future of work, it's been important for me to have expert analysis of the themes that are playing out."
All in a day's work
/research/discussion/all-in-a-days-work | 12 June 2018

Researchers at the University of Cambridge are helping to understand the world of work: the good, the bad, the fair and the future.

Read the story: /stories/all-in-a-days-work
Let's get statted
/research/features/lets-get-statted | 3 June 2015

With more information than ever at our fingertips, statisticians are vital to innumerable fields and industries. Welcome to the world of the datarati, where humans and machines team up to crunch the numbers.

"I keep saying that the sexy job in the next 10 years will be statisticians, and I'm not kidding," Hal Varian, Chief Economist at Google, famously observed in 2009. It seems a difficult assertion to take seriously, but six years on there is little question that statisticians' skills are at a premium.

Indeed, we may need statisticians now more than at any time in our history. Even compared with a decade ago, we can now gather, produce and consume unimaginably large quantities of information. As Varian predicted, statisticians who can crunch these numbers are all the rage, and a new discipline, 'data science', which fuses statistics and computational work, has emerged.

"People are awash in data," reflects Zoubin Ghahramani, Professor of Information Engineering at Cambridge. "This is occurring across industry, it's changing society as we become more digitally connected, and it's true of the sciences as well, where fields like biology and astronomy generate vast amounts of data."

Over the past few years, Richard Samworth, Professor of Statistics, has watched the datarati step out from the shadows. "It's probably fair to say that statistics didn't have the world's best PR for quite a long time," he says. "Since this explosion in the amount of data that we can collect and store, opportunities have arisen to answer questions we previously had no hope of being able to address. These demand an awful lot of new statistical techniques."

'Big data' is most obviously relevant to the sciences, where large volumes of information are gathered to answer questions in fields such as genetics, astronomy and particle physics, but it also has more familiar applications. Transport authorities gather data from electronic ticketing systems like Oyster cards to understand more about passenger movements; supermarkets closely monitor customer transactions to react to shoppers' predilections. As users of social media, many of us disclose data about ourselves that is as valuable to marketing as it is relevant to psychoanalytics.
Increasingly, we are also 'lifeloggers', monitoring our own behaviour, health, diet and fitness through smart technology.

This information, as Ghahramani points out, is no use on its own: "It fills hard drives, but to extract value from it, we need methods that learn patterns in the data and allow us to make predictions and intelligent decisions." This is what statisticians, computer scientists and machine learning specialists bring to the party: they build algorithms, which are coded as computer software, to see patterns. At root, the datarati are interpreters.

Despite their 'sexy' new image, however, not enough data scientists exist to meet this rocketing demand. Could some aspects of the interpretation be automated using artificial intelligence instead, Ghahramani wondered? And so, in 2014, with funding from Google, the first incarnation of the Automatic Statistician was launched online. Despite minimal publicity, 3,000 users uploaded datasets to it within a few months.

Once fed a dataset, the Automatic Statistician assesses it against various statistical models, interprets the data and, uniquely, translates this interpretation into a short report of readable English. It does this without human intervention, drawing on an open-ended 'grammar' of statistical models. It is also deliberately conservative, basing its assessments only on sound statistical methodology, and even critiquing its own approach.

Ghahramani and his team are now refining the system to cope with the messy, incomplete nature of real-world data, and they also plan to develop its base of knowledge and to offer interactive reports. In the longer term, they hope that the Automatic Statistician will learn from its own work: "The idea is that it will look at a new dataset and say, 'Ah, I've seen this kind of thing before, so maybe I should check the model I used last time'," he explains.
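The report-writing loop is easier to picture with a toy example. The real Automatic Statistician searches an open-ended grammar of Gaussian-process models; the sketch below shrinks that to four polynomial candidates scored by BIC, and every name and phrase in it is illustrative rather than taken from the system.

```python
import numpy as np

def bic(n, k, rss):
    """Bayesian information criterion for a least-squares fit (lower is better)."""
    return n * np.log(max(rss, 1e-12) / n) + k * np.log(n)

def fit_poly(x, y, degree):
    """Fit a polynomial and return its residual sum of squares."""
    coeffs = np.polyfit(x, y, degree)
    return float(np.sum((np.polyval(coeffs, x) - y) ** 2))

def automatic_report(x, y, max_degree=3):
    """Score each candidate model, pick the best, and say so in plain English."""
    names = {0: "a constant level", 1: "a linear trend",
             2: "a quadratic trend", 3: "a cubic trend"}
    scores = {d: bic(len(x), d + 1, fit_poly(x, y, d)) for d in range(max_degree + 1)}
    best = min(scores, key=scores.get)
    return f"The data are best described by {names[best]} (BIC {scores[best]:.1f})."
```

The conservatism described above corresponds to the penalty term in the score: a more flexible model is only reported if it improves the fit by enough to justify its extra parameters.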
While automated systems rely on existing models, new algorithms are needed to extract useful information from evolving and expanding datasets. Here, the role of human statisticians is vital.

To characterise the problem, Samworth presents a then-and-now comparison. During the past century, a typical statistical problem might, for instance, have been to understand the relationship between the initial speed and stopping distance of cars, based on a sample size of 50.

These days, however, we can record information on a huge number of variables at once: the weather, road surface, make of car, wind direction, and so on. Although the extra information has the potential to yield better models and reduce uncertainty, in many areas the number of features measured is so high that it may even exceed the number of observations. Identifying appropriate models in this context is a serious challenge, which requires the development of new algorithms.

To resolve this, statisticians rely on a principle called 'sparsity': the idea that only a few bits of the dataset are really important. The statistician identifies these needles in the haystack. Various algorithms have been developed to select the important variables, so that the initial sprawl of information starts to become manageable and patterns can be extracted.

Together with his colleague Dr Rajen Shah in the Department of Pure Mathematics and Mathematical Statistics, Samworth has developed a method for refining any such variable selection technique, called 'Complementary Pairs Stability Selection'. This applies the original method to random subsamples of the data instead of the whole, and does this over and over again. Eventually, the variables that appear on a high proportion of the subsamples emerge as those meriting further attention.
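In outline, the procedure is simple enough to sketch. The code below is an illustrative reading of Complementary Pairs Stability Selection with the lasso as the base selection method; the regularisation strength, the number of draws B and the selection threshold are placeholder values, not the choices analysed in Shah and Samworth's paper.

```python
import numpy as np
from sklearn.linear_model import Lasso

def cpss(X, y, alpha=0.1, B=50, tau=0.6, seed=0):
    """Keep the variables selected on a high fraction of complementary halves."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    counts = np.zeros(p)
    for _ in range(B):
        idx = rng.permutation(n)
        half = n // 2
        # A complementary pair: two disjoint halves of the same shuffle.
        for rows in (idx[:half], idx[half:2 * half]):
            fitted = Lasso(alpha=alpha).fit(X[rows], y[rows])
            counts += fitted.coef_ != 0
    freq = counts / (2 * B)
    return np.flatnonzero(freq >= tau)  # indices of the 'stable' variables
```

Variables that genuinely matter tend to be picked on most half-samples regardless of which rows were drawn, while noise variables come and go; thresholding the selection frequency is what stabilises the base method.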
Scanning Google Scholar for citations of the paper in which this was proposed, Samworth finds that his algorithm has been used in numerous research projects. One looks at how to improve fundraising for disaster zones, another examines potential biomarkers for breast cancer survival, and a third identifies risk factors connected with childhood malnutrition.

How does he feel when he sees his work being applied so far and wide? "It's funny," he says. "My training is in mathematics and I still get a kick from proving a theorem, but it's also rewarding to see people using your work. It's often said that the good thing about being a statistician is that you get to play in everyone's back yard. I suppose this demonstrates why that's true."

Inset image: left to right, Zoubin Ghahramani and Richard Samworth.

Related link: the Automatic Statistician (https://www.automaticstatistician.com/)