AUTONOMOUS SYSTEMS DESIGN, TESTING, AND DEPLOYMENT: LESSONS LEARNED FROM THE DEPLOYMENT OF AN AUTONOMOUS SHUTTLE BUS

Lance Sherry, John Shortle, George Donohue, Brett Berlin, Jonathan West
Center for Air Transportation Systems Research at George Mason University, Fairfax, Virginia

Abstract

Advances in technology have enabled the deployment of unprecedented levels of automation that verge on completely autonomous systems, such as unmanned passenger and cargo vehicles and integrated communications, navigation and surveillance (ICNS) systems.

One application of the new technologies is in autonomous shuttle buses. This paper describes an analysis of a collision between an autonomous shuttle bus and a delivery tractor-trailer on an urban street in Las Vegas. The analysis provides lessons learned for the design, testing, and fielding of autonomous systems. First, the analysis demonstrates the difficulty in designing for all the "corner-cases" for safe fielding of an autonomous system. Second, the analysis shows the difficulty in demonstrating safety compliance to a target level of safety for systems developed using machine learning that cannot be tested using traditional testing methods (e.g., code inspection or forms of input-output testing). Third, the analysis identifies the need for the explicit, intentional design, not as an afterthought, of the task of the "safety driver." Solutions to these three issues are discussed.

Introduction

Advances in technology have enabled the deployment of unprecedented levels of automation approaching near-autonomous systems, such as unmanned passenger and cargo vehicles and air traffic control.

One example of an application of autonomous systems is urban, fixed-route shuttle buses.

Traditional guidance and control systems (G&CS), used in many types of transport vehicles, command a vehicle to follow a pre-defined path using state information derived from sensors such as Global Positioning Systems (GPS), Inertial Navigation Systems (INS), and/or radio navigation aids [1], [2]. These designs are sufficient for vehicles that operate in a near-sterile environment in which progress on the pre-defined path can be managed by basic sensors on the vehicle.

To operate in a non-sterile environment, such as an urban street, the vehicle needs information about the environment, such as obstacles, traffic, and flow instructions (e.g., traffic lights). In one implementation, a complex, ubiquitous surveillance and communication infrastructure is deployed to provide communication to the vehicle on the current and future state of the environment.

In an alternate, "autonomous system" implementation, the vehicle is not dependent on an external, infrastructure-derived communication, navigation and surveillance infrastructure. In this implementation, the vehicle can derive the necessary information through its own suite of sensors. In this way, the vehicle can be "dropped in" to the existing environment and is capable of operating autonomously.

Recent advances in reliable, inexpensive sensors, such as cameras, LiDAR, and radar, along with advances in Machine Learning, make it feasible to fuse multiple sources of sensor data and develop algorithms that command complex vehicle guidance and control behavior. With this capability, vehicle guidance and control systems (G&CS) can now detect obstacles, traffic, and traffic flow instructions on their own, and make decisions on how to respond to complex, non-sterile environments.

Traditional vehicle G&CS for operations in sterile environments exhibit finite and relatively low complexity. As a consequence, the G&CS can be coded using a combination of rule-based and continuous closed-loop functions [3].
The intended behavior can be specified a priori and tested completely (i.e., every combination of inputs has an output) and comprehensively (i.e., every output is determined by a valid combination of inputs). Established methods for verification testing, validation testing, and demonstrating safety compliance are applied and used for regulatory approval. These systems are certified to 1E-5, 1E-7, or 1E-9 target levels of safety [4].

Due to the complexities in recognizing non-uniform objects and situations in the non-sterile environment, and in responding to the exponentially large number of combinations of events, vehicle G&CS for non-sterile environments are "coded" using Machine Learning supervised-training techniques (instead of traditional rule-based software algorithms). In supervised training, the Machine Learning "rides along" with a human operator, recording the operator's response to every situation experienced by the vehicle. After experiencing the response to a specific situation enough times, the Machine Learning algorithm can "learn" the appropriate response and generalize it to other, similar situations.

To deploy a safe and secure autonomous G&CS for a shuttle bus, an appropriate response to all the possible emerging scenarios that can occur on an urban street must be encoded into the vehicle G&CS. In a dynamic, complex, and ever-changing domain such as urban streets, there can be hundreds of thousands of static and emerging situations. Before being deployed for revenue service operations, the vehicle G&CS must be tested to demonstrate that all of the possible emerging scenarios that can occur in the field do not result in a hazardous outcome [5], [6].

Modern engineering best practices attempt to design vehicle G&CS to address all the possible scenarios that can occur in the real world. Due to the complexity of the domain, and the combinatorics, it is possible to deliver a G&CS that does not cover unusual situations, known as "corner-cases." One way to uncover missing corner-cases, and to demonstrate safety, is to accrue tens of thousands of hours of operation with an "attendant" or "safety driver" monitoring the vehicle G&CS, with the responsibility to intervene should a situation emerge with a potentially hazardous outcome [7].

Monitoring and intervention for hazardous rare events is a complex human-machine interaction process that human operators are not well suited to perform without careful design of procedures and the associated vehicle G&CS user interface [8], [9].

This paper describes an analysis of a collision between an autonomous shuttle bus and a delivery tractor-trailer on an urban street in Las Vegas to provide lessons learned for the design and testing of future autonomous systems [10].

The analysis identified three main lessons learned for the design, testing, and fielding of autonomous systems.

First, the design of the guidance and control system must anticipate and handle all the possible real-world situations that can occur. Even on a simple shuttle bus loop route, an unusual situation presented itself 7 minutes after the start of deployment that was not properly coded in the autonomous control system. This emphasizes the difficulty in designing for all the "corner-cases" for safe fielding of an autonomous system, and emphasizes the need for alternate methods for enhancing the scenarios that the vehicle is exposed to in the supervised learning phase of G&CS development.

Second, the use of Machine Learning in the design of the G&CS results in a "black-box" automaton that prohibits the use of traditional testing methods approved by regulators for demonstrating compliance with safety requirements.
As a consequence, the Machine Learning G&CS cannot undergo a code inspection or any form of input-output testing, as the input-output relationship is "hidden" in the machine learning algorithm.

An alternative means of compliance is performance/risk-based testing, in which the system performance is recorded during demonstration testing and used as evidence for safety assurance. This approach is problematic: the number of miles required to demonstrate safety may only be completed after several years, possibly after the technology is already obsolete. For example, if the US driving fatality rate is 1E-8 fatalities per mile, then autonomous vehicles need to be 100 times safer (i.e., 1E-10 per mile). The statistical "rule of 3" says that if N data points are observed with zero fatalities, then the 95% upper bound on the fatality-rate estimate is 3/N. Demonstrating a 1E-10 per-mile target level of safety would therefore require 3E10 miles with zero fatalities. If a manufacturer has 1E7 miles driven to date, it would need to repeat the testing completed to date 3,000 times (with no fatalities).
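This arithmetic is easy to check. A minimal sketch (the rates and fleet mileage are the paper's illustrative figures, not measured data):

```python
def miles_required(target_rate_per_mile):
    """Rule of 3: with zero events observed in N trials, the ~95% upper
    confidence bound on the per-trial event rate is 3/N, so claiming a
    target rate requires N = 3 / target_rate event-free trials."""
    return 3.0 / target_rate_per_mile

human_rate = 1e-8                 # illustrative US fatality rate per mile
target_rate = human_rate / 100.0  # "100 times safer" -> 1e-10 per mile

required_miles = miles_required(target_rate)   # 3e10 fatality-free miles
driven_to_date = 1e7                           # illustrative fleet mileage
print(f"Required: {required_miles:.1e} fatality-free miles")
print(f"Repeat testing to date {required_miles / driven_to_date:,.0f} times")
```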

Third, to mitigate the potential for missing situations in the autonomous G&CS that lead to hazards during the performance-based testing described above, the regulators and operator must insert a human operator with responsibility to intervene in the event of a hazardous situation. The human operator's role, however, must be an intentional design, not an afterthought. The role and responsibilities of the human must be explicitly designed and supported by appropriate user interfaces, and the limitations of human reliability in monitoring for rare events must be considered.

The implications of these lessons for the design and deployment of future autonomous shuttle bus systems and aviation systems are discussed.

This paper is organized as follows: Section 2 describes the design and operation of the autonomous shuttle bus, the vehicle guidance and control system, and the attendant. Section 3 describes the accident scenario that occurred following deployment of the autonomous G&CS. Section 4 discusses the issues and lessons learned from the deployment of the autonomous G&CS and the way it supports the safety-driver role. Section 5 concludes with implications of this analysis, concepts to mitigate these issues, and future work.

System Components

The system had the following components: the shuttle, the autonomous shuttle guidance and control system, and the attendant with their manual-control user interface.

The Shuttle Bus

The shuttle bus is designed to transport a total of 15 passengers, 11 seated and 4 standing. It is a two-axle, battery-powered automated test vehicle with a Gross Vehicle Weight Rating (GVWR) of 3,500 pounds (Figure 1). The shuttle has two symmetrical ends, either of which can serve as the front or the rear. Passengers enter and exit through double sliding doors on one of the long sides of the vehicle.

FIGURE 1: Shuttle bus involved in the collision. Passenger entry/exit through sliding side door. Seating for 11 and standing room for 4. Can operate in either direction forward/backward.

The shuttle bus is powered by two batteries. An 80-volt traction battery operates the vehicle's electric motor. This battery is located at one end of the shuttle and accessed from an external door. A 12-volt battery serves as a back-up for operating the doors and other miscellaneous non-propulsion functions. The backup battery is located in an enclosed space inside the passenger compartment (at the same end as the 80-volt battery).

Two emergency stop buttons are located on either side of the central window opposite the loading doors (Figure 2). Pushing an emergency stop button turns off the motor, activates three types of brakes, and turns on flashing hazard signals.

FIGURE 2: On the long side opposite the doors: (1) red emergency stop buttons (inset 1), (2) information display screen (inset 2).

A navigation touch screen is also located on the long side opposite the loading doors (Figure 2). The screen displays information such as the battery's charge status and the vehicle route.

The loading doors are equipped with emergency release handles. If the doors cannot be used to evacuate the shuttle, a hammer is available to break the central window, marked "emergency exit."

A fire extinguisher and first aid kit are stored under the seats opposite the doors.

As the vehicle is designed primarily for autonomous operation, the shuttle does not have a steering wheel, a brake, or an accelerator pedal.

Shuttle Bus Autonomous Guidance and Control System (G&CS)

The Shuttle Guidance and Control System (G&CS) commands the shuttle along a pre-defined path defined by a sequence of legs/waypoints in a navigation database. The pre-defined path identifies the latitude and longitude for each point on the route and the speeds or turn radii on each segment.

To operate in an urban setting, the G&CS must also have knowledge of obstacles (e.g., pedestrians, work zones), traffic, traffic flow instructions (e.g., stop signs, traffic lights), and roadway features (such as grade).

A canonical architecture for a G&CS includes three nested loops (Figure 3). The inner loop is a path-tracking closed-loop control function for speed (i.e., accelerator, brake) and direction (i.e., steering). This function can be accurately designed and coded using traditional rule-based/continuous functions.

FIGURE 3: A canonical architecture for a G&CS includes three nested loops.
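To make the inner loop concrete, the sketch below shows one tick of a proportional speed-and-heading controller. This is a generic illustration of the rule-based/continuous style, not the shuttle's actual control law; the gains and signal names are illustrative assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class ControlCommand:
    throttle: float   # > 0 accelerates, < 0 brakes
    steering: float   # steering-angle command, radians

def inner_loop_tick(speed, target_speed, heading, target_heading,
                    kp_speed=0.5, kp_heading=1.2):
    """One tick of the path-tracking inner loop: proportional feedback
    on speed (accelerator/brake) and direction (steering)."""
    # Wrap the heading error into (-pi, pi] so the vehicle turns the short way.
    err = (target_heading - heading + math.pi) % (2.0 * math.pi) - math.pi
    return ControlCommand(throttle=kp_speed * (target_speed - speed),
                          steering=kp_heading * err)

# Example: slightly below target speed and pointing 0.1 rad left of course.
cmd = inner_loop_tick(speed=6.0, target_speed=7.2,
                      heading=0.1, target_heading=0.0)
# cmd.throttle == 0.6 (accelerate); cmd.steering == -0.12 (steer right)
```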

The desired path is determined by the next outer loop, known as the Guidance function. This function uses GPS, IMU, and rotational velocities to determine the vehicle's dynamic state, but supplements this information with non-sterile-environment information. The waypoint or segment is adjusted to account for roadway conditions, traffic signals, traffic, or other obstacles on the desired path. The Guidance function can change the velocity on the desired path and/or perform lateral path offsets.

The outermost loop adjusts the mission for contingencies. For fixed-route shuttle bus operations, the Mission Planning function is not invoked.

The Shuttle G&CS has limits on how much it can deviate from the designated route. In the event the shuttle must deviate from the route, an attendant must disengage the autonomous control system and maneuver the vehicle using manual controls or modify the mission plan.

Sensors

To keep the shuttle on its designated path, the shuttle includes a differential Global Positioning System (GPS) to identify the vehicle's latitude and longitude, and an Inertial Measurement Unit (IMU) that measures the shuttle's velocity, acceleration, and angular rate to refine its position and verify its location. In addition, an odometry device measures the speed of the wheels to estimate changes in the vehicle's position.

To keep the shuttle from colliding with objects in its path, and to assist in location identification, the shuttle has eight LiDAR (light detection and ranging) sensors and two stereoscopic cameras. LiDAR measures the distance to other objects using a laser and has a detection range of 40 meters under ideal conditions. Two LiDARs are positioned on the roof to give a 360-degree view around the vehicle. The primary purpose of the LiDAR is to detect obstacles, whether moving or stationary (cars backing out of parking spaces, motorcycles, bicycles, pedestrians, and so forth), on the roadway or sidewalk. The LiDARs are also used to verify the shuttle's location and path by matching objects and features. The stereoscopic cameras are mounted on the shuttle to monitor the outside environment as well as to analyze signs and traffic signals.

The sensors and their locations are illustrated in Figure 4.

FIGURE 4: Sensors and communication packages on the shuttle.

The shuttle also has a dedicated short-range communication system and a long-term evolution (LTE) antenna to communicate with traffic signals along the route. Another camera (fish-eye) is mounted on the ceiling of the shuttle to monitor passengers.
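The position estimate that feeds the Guidance loop is typically a fusion of these sensors. The sketch below is a minimal complementary-style blend of a differential-GPS fix with a position dead-reckoned from wheel odometry and IMU heading; the report does not detail the production fusion, so the weights and function names are assumptions.

```python
import math

def dead_reckon(xy, speed_mps, heading_rad, dt_s):
    """Propagate position from wheel-odometry speed and IMU heading."""
    x, y = xy
    return (x + speed_mps * math.cos(heading_rad) * dt_s,
            y + speed_mps * math.sin(heading_rad) * dt_s)

def fuse(gps_xy, dr_xy, gps_weight=0.2):
    """Blend the differential-GPS fix with the dead-reckoned estimate.
    gps_weight encodes relative confidence in the absolute fix."""
    return tuple(gps_weight * g + (1.0 - gps_weight) * d
                 for g, d in zip(gps_xy, dr_xy))

# One update cycle: propagate at 8 m/s heading due east for 0.1 s,
# then correct with a (noisy) GPS fix.
estimate = dead_reckon((0.0, 0.0), 8.0, 0.0, 0.1)   # -> (0.8, 0.0)
estimate = fuse((0.9, 0.1), estimate)                # -> (0.82, 0.02)
```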

The shuttle's performance was monitored in real time from a control center in Lyon, France. The control center operated 24 hours a day, 7 days a week.

The designated path is created during a map-making activity. The LiDAR and camera systems record environmental features such as roadway markings, curbs, stop lines, traffic signals, signs, road grade and curvature, and certain non-traffic static objects such as buildings. As the shuttle travels along the mapped coordinates of its path, the various systems continuously scan the environment and verify that the detected objects (e.g., stop signs) and roadway features (e.g., grade) match those on the mapped route at those specific locations.

The Attendant

Because the shuttle operates autonomously, regulatory authorities require an attendant to be on board to supervise and intervene if necessary.

According to the shuttle operator's training booklet, the attendant's duties include:

(1) receiving passengers on board
(2) checking that the vehicle functions properly
(3) reporting errors to the supervision center
(4) maintaining the security of passengers inside the vehicle and of pedestrians outside
(5) reporting damage or injuries
(6) monitoring and intervening as necessary

The attendant is responsible for initiating the shuttle's autonomous operation, can request stops at designated locations, and opens and closes the doors. Only the Guidance and Control System can initiate departure once the system is turned on.

In the event of an unexpected or erroneous action by the shuttle's autonomous system, the attendant can notify Navya by pressing an intercom button on a speaker next to the navigation touch screen. Activating the intercom connects the attendant to the control center in France.

Part of the attendant's regular duties included using the hand-held controller to load the shuttle on and off a tow truck (to take it to and from its route location), to maneuver the shuttle in the yard where it is stored, and to maneuver the shuttle into parking spaces.

The attendant is also required to operate the shuttle manually if an obstacle blocks its path and the shuttle cannot proceed without deviating outside its designated path (for example, a stalled vehicle). In these situations, the attendant uses a handheld controller to maneuver the shuttle around the obstacle and then return it to its path. The attendant then re-engages autonomous mode.

Manual Controller Used by the Attendant

A trained driver (attendant) can use manual control to operate the shuttle outside the predetermined path (for example, to move it from a storage location to its mapped route or to navigate around stationary objects). This operation is accomplished using an X-Box-style hand-held controller (Figure 5).

FIGURE 5: Hand-held controller for manual operation of the shuttle.

Pressing the "operator presence" button on the controller activates manual mode. In addition to steering the shuttle, the controller engages the emergency brake, horn, or buzzer; opens and closes the doors; and activates the turn signals (blinkers). Pressing both turn signal buttons activates the hazard warning lights.

Releasing the control button (green X at the center of the controller) activates the emergency brake. Pressing the "standby" button disables propulsion. Pressing two buttons on the controller ("operator presence" and "autonomous drive") returns the shuttle to autonomous mode.
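The controller's mode logic can be summarized as a small state machine. The sketch below is inferred from the behaviors described above (operator-presence button, two-button return to autonomous mode, brake-on-release of the green X, and the wall-mounted emergency stop); the states, names, and transition priorities are assumptions, not Navya's implementation.

```python
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()
    MANUAL = auto()
    EMERGENCY_BRAKE = auto()

def next_mode(mode: Mode, *, operator_presence: bool, autonomous_drive: bool,
              control_button_held: bool, wall_estop_pressed: bool) -> Mode:
    """Mode transitions inferred from the operator documentation."""
    if wall_estop_pressed:
        # Wall-mounted emergency stop: motor off, brakes on, hazards flashing.
        return Mode.EMERGENCY_BRAKE
    if operator_presence and autonomous_drive:
        # Two-button press returns the shuttle to autonomous mode.
        return Mode.AUTONOMOUS
    if mode is Mode.MANUAL and not control_button_held:
        # Releasing the green X control button engages the emergency brake.
        return Mode.EMERGENCY_BRAKE
    if operator_presence:
        # "Operator presence" button alone activates manual mode.
        return Mode.MANUAL
    return mode
```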

Prior to this incident, the controller was stored in an enclosed space at one end of the passenger compartment. In the incident, the attendant did not retrieve the controller during the event. After the crash, a new company policy was established: attendants now remove the controller from its storage space at the beginning of a trip and keep it available throughout the trip.

The Accident Scenario

The shuttle was a test vehicle, part of a pilot program in Las Vegas, and was on its first day of passenger-carrying operation (shuttle rides were free) when the collision occurred. The designated path is a 0.6-mile circular route with right turns only. The vehicle was limited to a maximum speed of 16 miles per hour.

The shuttle started the trip at Container Park on Fremont Street heading west (Figure 6). At the beginning of the trip, the attendant boarded with the passengers at Container Park and started the shuttle in autonomous mode. Eight seats were occupied at the time of the collision.

FIGURE 6: Designated path, direction of travel, and location of alley. North up.

The shuttle turned right onto South 8th Street, then right again onto East Carson Avenue. It stopped at an information kiosk at South 8th Street and at East Carson Avenue. The shuttle then turned south onto South 6th Street. A little over halfway down South 6th Street, a tractor-trailer delivery truck was backing into an alley.

The truck driver was backing into an alley perpendicular to South 6th Street. This maneuver is standard procedure, as the truck was "too long" to enter the alley from Las Vegas Boulevard. Backing into the alley is considered the "only way . . . to be safe pulling out."

When the shuttle turned onto South 6th Street, the tractor-trailer, facing south, had pulled up past the alley. The driver told investigators that he activated his flashers as he pulled up to the alley. Two cars were behind him. He waved the two vehicles by.

When the driver began backing up, he saw the shuttle turn onto South 6th Street from Carson Avenue. He said that he knew the shuttle was automated, having seen it previously doing "test runs" on Fremont Street. He said that he had no concerns about sharing a road with the shuttle and assumed it would come to a stop to allow the truck to enter the alley.

As the driver backed up, he paid particular attention to vehicles parked on the east side of South 6th Street so as not to strike them. The cars were on the left side of the truck.

At this time, the driver looked to the right and noted that the shuttle was halfway down the street. He stated that he assumed the shuttle would stop a "reasonable" distance from the truck.

The driver said that he then looked back to the left and saw a pedestrian in the alley. He waited until the pedestrian cleared.

During this time, the attendant told investigators that the shuttle slowed as it approached the tractor-trailer. The shuttle stopped 3.1 meters (10.2 feet) from the tractor-trailer.

Just before the shuttle stopped, the attendant, unsure whether the shuttle would come to a complete stop, pressed one of the emergency stop buttons on the wall opposite the loading doors. Recorded data show that the shuttle's speed was less than 1 mph (0.249 meters per second, or 0.56 mph) when the attendant pressed the emergency stop button. This action disengaged the autonomous G&CS. The shuttle was now under manual control.

The truck driver saw the pedestrian clear to the left of the truck and started backing up. The attendant and the passengers became concerned that the tractor-trailer was on a trajectory to collide with the shuttle, and waved to get the driver's attention. Four cameras inside the shuttle showed the attendant and passengers waving to the truck driver.

The tractor-trailer driver continued in reverse. While backing up, the driver turned his attention to the right, which was when the truck hit the shuttle. According to the incident report, the shuttle was struck by the right front tire of the slow-moving tractor-trailer eleven seconds after the shuttle stopped.

The shuttle attendant believed the shuttle was visible to the truck driver in the right-side mirror from the time the shuttle stopped until the collision. In a post-accident analysis of sight angles, parts of a surrogate shuttle were visible through the windows and in the mirrors on the right side of the tractor-trailer.

The attendant said that he considered switching to manual mode to move the shuttle, but that he had very little time. He further stated that manual mode was not designed or intended to be used as an emergency mode. That statement was consistent with the shuttle operator's policy, as reported to NTSB investigators.

FIGURE 7: Sequence of events leading to the collision. The shuttle bus stops short of the tractor-trailer but within the turn radius of the reversing tractor-trailer.

Discussion

The National Transportation Safety Board (NTSB) determined that the "probable cause" of the collision between the tractor-trailer and the autonomously operated shuttle was:

"the truck driver's action of backing into an alley, and his expectation that the shuttle would stop at a sufficient distance from his vehicle to allow him to complete his backup maneuver." [10]

This is indicative of a missing situation in the design and testing of the G&CS. Given the distance at which the shuttle bus stopped from the tractor-trailer, perhaps the design did not adequately distinguish between a tractor-trailer crossing the street and a tractor-trailer backing up with an associated turn radius.

Likewise, in testing, the G&CS was not exposed to this "corner-case" in such a way as to confirm the ability of the system to address this scenario.

The NTSB also cited as "contributing to the cause" of the collision the:

"attendant not being in a position to take manual control of the vehicle in an emergency." [10]

These conclusions raise three issues for the design, testing, and certification of autonomous guidance and control systems:

1. The need to generate a complete and comprehensive set of training data, including "corner-cases," with sufficient repetition for Machine Learning supervised training.

2. The need to generate a complete and comprehensive set of test scenarios, including "corner-cases," to ensure complete and comprehensive testing and demonstration of safety compliance.

3. The need to design a user interface to support the task assigned to the "attendant" or "safety driver." Tasks include monitoring, but also meaningful and timely intervention.

These issues and proposed solutions are summarized in Table 1 and discussed below.

TABLE 1: Three issues and proposed solutions to address the challenges in design, testing, and deployment of guidance and control systems for autonomous vehicles

Challenge | Issue | Proposed Solution
Design | Generating a complete and comprehensive set of training data, including "corner-cases," with sufficient repetition for Machine Learning supervised training | Fast-Time Emergent Scenario Simulation (FTESS)
Testing & Demonstration of Safety Compliance | Generating a complete and comprehensive set of test scenarios, including "corner-cases," to ensure complete and comprehensive testing | Scenario generation for augmented-reality testing of the vehicle on a treadmill. Note: continues simultaneously with deployment to identify potentially hazardous scenarios before they occur in the field
Performance of the Safety Driver in Responding to High-Risk Events during the Testing Period | Absence of explicit design of the procedures, tasks, and user interface to support the task | Safety Driver Monitoring Systems (SDMS)

Generating a Sufficient Number of Corner-Cases for Supervised Training

The accident on the first loop of the first day of operations demonstrates that the real-world urban environment is complex beyond the imagination of the designers. Even with the best engineering practices and a simple route, nuanced circumstances can arise that are not appropriately captured in the G&CS.

In this case, the G&CS correctly identified a tractor-trailer obstruction and brought the shuttle to a stop 3.1 meters (10.2 feet) from the tractor-trailer. This distance was appropriate for a tractor-trailer crossing the street, but not for a tractor-trailer backing into an alley with a turn radius. The headway distance assigned in the G&CS did not account for the trajectory resulting from the turning radius of the tractor-trailer.
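The geometric distinction is easy to illustrate: a reversing articulated truck sweeps an arc, so a headway that clears a straight crossing path can still lie inside the swept path. A back-of-envelope sketch (an idealized circular-arc model; the radius and offsets are hypothetical, not measurements from the accident):

```python
import math

def inside_swept_arc(dist_along_m, dist_lateral_m,
                     turn_radius_m, truck_half_width_m):
    """Idealized check: model the reversing truck's outer corner as sweeping
    a circular arc of radius (turn radius + half body width) about the turn
    center. A stopped vehicle whose closest point lies within that radius
    is inside the swept path even if it is clear of the truck's current
    straight-line track."""
    outer_radius = turn_radius_m + truck_half_width_m
    return math.hypot(dist_along_m, dist_lateral_m) < outer_radius

# Hypothetical numbers: a stop ~3.1 m laterally clear of the trailer's side
# can still sit inside the arc swept by a tractor turning on a ~12 m radius.
print(inside_swept_arc(dist_along_m=10.0, dist_lateral_m=3.1,
                       turn_radius_m=12.0, truck_half_width_m=1.3))  # True
```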

When the environment in which the G&CS operates is relatively finite and simple, traditional engineering practices can account for all the possible combinations of situations that can occur. One example of a relatively finite and simple environment is aircraft operating in three-dimensional airspace, which is relatively sterile compared to an urban street.

When the environment in which the G&CS must operate is complex and/or infinite, traditional engineering practices are limited. In these cases, rather than designing the G&CS behavior by having humans imagine the situations and corresponding behaviors, the engineering is conducted by Machine Learning algorithms that record, process, and encode the behavior of expert human operators. The ML algorithms must be exposed to all scenarios with sufficient frequency that the ML can: (i) encode all plausible situation-behavior pairs, and (ii) encode subtle differences in situations that require completely different responses from the G&CS.

This approach involves logging millions of miles of real-world driving in an attempt to capture all plausible situation-behavior pairs. For example, if situations associated with fatal accidents are taken as the least likely to occur, and these accidents occur at 1E-8 per mile, then autonomous vehicles need to perform at 1E-10 per mile, requiring 100 times more miles to be exposed to these situation-behaviors. The statistical "rule of 3" says that if N data points are observed with zero occurrences of a specified event, then the 95% upper bound on the event-rate estimate is 3/N. Demonstrating 1E-10 per mile would therefore require 3E10 miles. If a manufacturer has 1E7 miles driven to date, it would need to repeat the testing completed to date 3,000 times (with no fatalities).

Sherry, Shortle, Donohue, and Donnelly [11] and Nanduri & Sherry [12] proposed a "Fast-Time Emergent Scenario Simulation (FTESS)." FTESS is an agent-based, rare-event simulation "digital twin" of the system.

FTESS can be used to expand the training data for the ML algorithms with a significant reduction in time. The FTESS starts with a "seed" scenario from real-world driving and, using Monte Carlo techniques on a supercomputer, generates variations on the seed scenario.

The FTESS leverages techniques for rare-event simulation and edge computing, and runs on a supercomputer. Inexpensive access to cloud-based supercomputing makes this practical; GMU has access to the Argos Supercomputing Cluster, and Ph.D. students are trained to run parallel computing algorithms on the cluster.

There are two main approaches to rare-event simulation: importance sampling (IS) and splitting [13], [14], [15]. The idea of IS is to change the underlying sampling distribution so that rare events are more likely. The idea of splitting is to create separate copies of the simulation whenever the simulation gets "close" to the rare event of interest, effectively multiplying promising runs that are more likely to reach the rare event. Splitting is useful for systems that tend to take many incremental steps on the path to the rare event. IS is useful for systems that tend to take a small number of "catastrophic jumps" to the rare event.
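A toy illustration of the splitting idea (a drifting random walk standing in for an emergent hazard scenario; this is not the FTESS code): trajectories that reach an intermediate level toward the rare event are cloned, multiplying promising runs while per-path weights keep the estimate unbiased.

```python
import random

def splitting_estimate(threshold=8.0, levels=(2.0, 4.0, 6.0),
                       n_start=10_000, clones=4, seed=1):
    """Estimate P(a drifting random walk reaches `threshold` before
    returning to 0) by cloning ('splitting') each trajectory that reaches
    an intermediate level, so promising runs are multiplied."""
    random.seed(seed)
    paths = [0.0] * n_start
    weight = 1.0 / n_start          # per-path weight, divided at each split
    for level in (*levels, threshold):
        survivors = []
        for x in paths:
            while 0.0 <= x < level:
                x += random.gauss(0.05, 1.0)   # small drift toward the hazard
            if x >= level:                     # reached the level before 0
                survivors.append(x)
        if level == threshold:
            return len(survivors) * weight
        paths = [x for x in survivors for _ in range(clones)]
        weight /= clones
    return 0.0

print(f"Estimated rare-event probability: {splitting_estimate():.2e}")
```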
