Deere Advances In-Field Autonomy
While stereo cameras and computer vision guide Deere’s “limited release” 8R autonomous tractor, Bear Flag’s lidar tech will augment future machines.
John Deere got “really serious” about autonomy in 2019, according to Joe Liefer, senior product manager of autonomy at John Deere Intelligent Solutions Group. Three years later – after forming an in-house development team and acquiring some tech-startup expertise – the machinery maker revealed a fully autonomous tractor at CES 2022 that it claims is ready for large-scale production.
Based on Deere’s 8R tractor, the machine combines a TruSet-enabled chisel plow, a GPS guidance system, advanced AI and six pairs of stereo cameras that enable 360-degree obstacle detection and distance calculation. The autonomous 8R tractor also continuously checks its position relative to a geofence, with accuracy to within 1 inch (2.5 cm), Deere claims. Farmers monitor and control it from a smartphone app.
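At its core, a geofence check of this kind reduces to a point-in-polygon test against the field boundary. The sketch below is purely illustrative – the function name, field coordinates, and ray-casting approach are assumptions for this example, not Deere's actual implementation:

```python
from typing import List, Tuple

Point = Tuple[float, float]  # (easting, northing) in meters

def inside_geofence(pos: Point, fence: List[Point]) -> bool:
    """Ray-casting point-in-polygon test: cast a ray to the right of
    `pos` and count boundary-edge crossings; an odd count means the
    position is inside the fence."""
    x, y = pos
    inside = False
    n = len(fence)
    for i in range(n):
        x1, y1 = fence[i]
        x2, y2 = fence[(i + 1) % n]
        # Does this edge straddle the horizontal line through y?
        if (y1 > y) != (y2 > y):
            # x-coordinate where the edge crosses that line
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical square field, 100 m on a side
field = [(0.0, 0.0), (100.0, 0.0), (100.0, 100.0), (0.0, 100.0)]
print(inside_geofence((50.0, 50.0), field))   # tractor well inside
print(inside_geofence((150.0, 50.0), field))  # outside the boundary
```

A production system would run a check like this continuously against RTK-corrected GPS fixes and command a stop before the machine reaches the boundary rather than after crossing it.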
The tractor collected over 50 million images during in-field testing over the past three years, according to Deere. Each model is trained with hundreds of thousands of images, and its neural network classifies each pixel in about 100 milliseconds, determining whether the machine continues to move or stops if an obstacle is detected. Capable of preparing more than 325 acres of soil in 24 hours, the autonomous tractor will be available to farmers later in 2022.
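The stop-or-continue decision described above can be sketched as a simple rule over a per-pixel class map. The class IDs, corridor geometry, and noise threshold below are all hypothetical – Deere has not published its network or decision logic – but the sketch shows the shape of the idea: segment every pixel, then halt if enough obstacle pixels fall in the tractor's path:

```python
import numpy as np

# Hypothetical class IDs from a semantic-segmentation model;
# the real classes and network are not public.
SOIL, CROP_RESIDUE, OBSTACLE = 0, 1, 2

def should_stop(class_map: np.ndarray, corridor: slice,
                min_obstacle_pixels: int = 50) -> bool:
    """Stop if enough pixels in the drive corridor (a column band of
    the image) are classified as obstacle; counts below the threshold
    are treated as classifier noise and ignored."""
    band = class_map[:, corridor]
    return int((band == OBSTACLE).sum()) >= min_obstacle_pixels

# 480x640 frame of bare soil, then paint a 20x20 obstacle mid-frame
frame = np.full((480, 640), SOIL, dtype=np.uint8)
print(should_stop(frame, corridor=slice(200, 440)))  # clear path
frame[230:250, 300:320] = OBSTACLE                   # 400 obstacle pixels
print(should_stop(frame, corridor=slice(200, 440)))  # obstacle in path
```

In practice the segmentation itself would come from a trained network running on the tractor's GPUs, with stereo depth used to weight how close – and therefore how urgent – a detected obstacle is.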
Liefer discussed Deere’s autonomous tractor and its ongoing development activities with Grayson Brulte, host of the SAE Tomorrow Today podcast. Highlights from that discussion follow.
How are farmers approaching autonomy?
Really, we’ve developed autonomy at the request of farmers. Farmers face incredible challenges every growing season, time being one of the biggest: ‘I have to do two to three things at the same time. How do I find enough people to help me farm this land or run the equipment?’ Farmers have been asking for ways to add more automation, and in the last several years they’ve been requesting autonomy. That’s when we got really serious about it, in 2019. How do we bring true in-field autonomy to farmers to be another tool in their toolbox? So, for the first time, as we revealed at CES, we’re giving the farmer the opportunity to step out of the cab, out of that tractor seat, and then they can go do something else – a higher value-added job on the farm [or] spending more time with family, for example.
How do farmers set up and operate the autonomous tractor?
A farmer drives the tractor to the field and selects the boundary – they do some quick setup in the tractor on the command center for that job. It can take 60 seconds and then they step out of the cab and pull out their smartphone, connect to John Deere Operations Center Mobile, and from there they can start the job and that’s how they continue to monitor it, stop it and make remote adjustments. Also, they now for the first time can see live video from any of our six stereo cameras that are on this new autonomous 8R.
All kinds of technology tie the John Deere Precision Ag suite together into one seamless product. But then we also add more enhancements to the current tech to give them true field context and true controllability of the equipment from a smartphone. If there’s an operational decision that needs to be made – say something unexpected is in the field – they’re going to get a push notification, be able to open their cell phone and…give the system permission to route around an object and keep doing the job. Then they can go back later and remove that object and clean up the field.
Is there a limit for how many autonomous tractors can run at once?
Ultimately, it’s really a grower preference on how they want to size equipment and how many tractors, combines, sprayers that they want to manage in order to cover their acres. From a tech standpoint, what we revealed in 2022 is one autonomous tractor operating in the field at any given time. But John Deere already has technology we call ‘coverage map sharing’ that allows multiple machines to work in the field at the same time. As we go forward into 2023 and beyond, we will enable autonomy to follow suit, where you could have multiple pieces of equipment jointly doing path planning and job optimization together in the field. So then not only could you have multiple machines in a field, but you could have multiple autonomous machines in your fleet across a number of fields that you’re managing.
Will Deere’s See & Spray technology be integrated into autonomous machines?
We just introduced See & Spray Ultimate [in March 2022], that’s really making the equipment superhuman…We’re going to listen to our farmers and understand where they want to take autonomy next. We started in fall tillage because that’s where farmers have a tremendous amount of labor pressure. We’re continuing to talk to farmers as [they experience] this type of technology for the first time. The conversations before they’ve actually seen our autonomous 8R versus after they’ve used it – controlled this 40,000-pound machine from their smartphone – really get the gears turning quickly: ‘How could I start to farm differently if I had this technology?’
How do you build trust in the tech as you roll out the autonomous 8R?
Part of the reason we’re approaching this technology rollout differently with autonomy is it’s not available for mass production; it’s available in a limited release. That allows us to open up that conversation publicly with growers all over not only the U.S. but really the world. This fall, we will ship units to farmers to use, and we’ll partner with the local John Deere dealers as well to get more eyes on the equipment, to gain more feedback. Then as we go into 2023 and beyond there’ll be more units available to start to really grow our geographic footprint for local demonstrations and local feedback.
Do you see more startups pursuing autonomy in agriculture?
Agriculture’s just a super exciting place to be. At John Deere, along with the startups we acquired – Blue River Technology [in 2017] and Bear Flag Robotics [in 2021] – we believe we can lead in some of this technology in the agricultural space: in-field autonomy, as an example, versus on-highway. There are monumental challenges that we have, and the technology that is coming online today – computer vision, some of these advanced controls for hydraulics and electrification – can be put to use on a farm field essentially tomorrow. That’s what I think is starting to draw more startups to the space. It becomes, ‘We could commercialize this product today. Let’s go work in this agricultural space.’ For Deere, we’re always looking at how we can partner and work with innovative new companies to bring that value to farmers on a global scale.
How will Bear Flag’s technology be integrated into the overall Deere autonomy stack?
We were on the same mission, so that’s exciting as we truly start to collaborate and work together. And they’ve taken a little bit different approach to autonomy. As we revealed at the Consumer Electronics Show, John Deere’s backbone is stereo cameras and computer vision to become the eyes around the equipment. Bear Flag has invested in some lidar and radar technology and become experts in that.
If the human eye cannot see it, neither can a stereo camera. As we start to go into these different jobs, you’ll see us start to integrate some tech out of Bear Flag into our autonomy stack. So, we’re going forward with developing one autonomy stack that’ll interface with Deere vehicles, but it’ll have a modular approach where we can add different sensing modalities to that autonomy stack depending on the job that we’re trying to do, or the machine form we’re bolted up to.
Can you add the components based on terrain and weather?
Agriculture all around the world is generally a dusty environment. Some jobs are dustier than others. You’ll see us focus on essentially the initial technology costs – we chose stereo cameras and that approach with our GPUs to really manage costs in the initial product. As we go into a more complex job, you’ll see us add other sensor packages that then will meet that need.
We’ve had to understand how to work around dust, and there are going to be some conditions where the system will not be able to safely see. Early on, we’ll be able to automatically detect those kinds of blackout conditions and the equipment will be stopped. There are also bugs at night, or [flocks of] birds. We’ve essentially had to train our system on what bugs look like, and on what birds look like, because it will stop for a bird flying in front of the tractor. Those are just some of the unexpected things that we’ve had to train on and learn so we can continue to work out in the field.