sUAS incident data collected from multiple sources and used to construct the knowledge base of the S-Agent can be accessed here: Go to AutoSimTestFramework → knowledge_base.csv
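To load the knowledge base programmatically, the minimal sketch below (not part of the framework) reads the CSV with pandas and prints its shape and column names; the exact schema of the incident records is not assumed here.

```python
# Minimal sketch, not part of the framework: inspect the S-Agent knowledge base.
# Assumes the CSV sits at AutoSimTestFramework/knowledge_base.csv relative to the repo root.
import pandas as pd

kb = pd.read_csv("AutoSimTestFramework/knowledge_base.csv")
print(kb.shape)              # number of incident records and fields
print(kb.columns.tolist())   # field names as stored in the CSV
print(kb.head())             # first few sUAS incident entries
```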
Data collected and used to build the knowledge base of the Analytics-Agent can be found here: Go to Px4-Flight-Controller-Params
These flight logs were generated by injecting failures into the PX4 flight controller; they were used to analyze the performance of the Analytics-Agent. Please access the flight logs here:
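PX4 normally records flight logs in the ULog format. Assuming these logs follow that format (an assumption, not stated above), they can be inspected with the pyulog package (`pip install pyulog`), as in the hedged sketch below; the file name is a placeholder.

```python
# Hedged sketch: inspect a PX4 flight log, assuming it is a standard ULog (.ulg) file.
# "path/to/flight_log.ulg" is a placeholder, not a file shipped with the repository.
from pyulog import ULog

ulog = ULog("path/to/flight_log.ulg")
for dataset in ulog.data_list:                           # each logged topic
    print(dataset.name, list(dataset.data.keys())[:5])   # topic name and a few field names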
The code developed to simulate run-time sensor errors during simulation can be found here: Go to SuT → px4 → standalone
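The standalone scripts in that folder contain the actual fault models. Purely as an illustration of the idea (and not the repository's implementation), the sketch below corrupts a stream of sensor readings with a few common error types; all names and parameters are hypothetical.

```python
# Illustrative sketch only: one way to emulate run-time sensor errors by
# perturbing readings before they reach the autopilot. This is NOT the
# repository's implementation; function and parameter names are hypothetical.
import random

def inject_sensor_error(reading: float, mode: str = "noise") -> float:
    """Return a corrupted copy of a single sensor reading."""
    if mode == "noise":   # additive Gaussian noise
        return reading + random.gauss(0.0, 0.5)
    if mode == "drift":   # constant bias
        return reading + 2.0
    if mode == "stuck":   # sensor frozen at a fixed value
        return 0.0
    return reading

if __name__ == "__main__":
    clean = [float(i) for i in range(5)]                      # pretend sensor stream
    faulty = [inject_sensor_error(r, "noise") for r in clean]
    print(list(zip(clean, faulty)))
```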
The mission inputs were generated using a model-based design approach for 5 different sUAS use-cases and can be accessed here: Go to Sample-of-Expected-Output → Mission-Samples-For-Each-Use-Case. This data was used to analyze whether the Mission-Agent can generate correct mission inputs for the different sUAS use-cases.
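For a quick overview of these samples, the sketch below (not part of the framework) walks the sample directory and prints the top-level keys of each mission JSON; the JSON schema itself is not assumed.

```python
# Minimal sketch: list the sample mission inputs and peek at their structure.
# Assumes the path Sample-of-Expected-Output/Mission-Samples-For-Each-Use-Case
# relative to the repo root; only top-level keys are printed for inspection.
import json
from pathlib import Path

sample_dir = Path("Sample-of-Expected-Output/Mission-Samples-For-Each-Use-Case")

for mission_file in sorted(sample_dir.rglob("*.json")):
    with mission_file.open() as f:
        mission = json.load(f)
    summary = list(mission) if isinstance(mission, dict) else f"list of {len(mission)} entries"
    print(mission_file.name, "->", summary)
```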
We designed our own Systems-Under-Test (SuTs). The implementation of these SuTs can be found here: Go to SuT. We tested the SuTs using the AutoSimTest Framework.
- Go to Analysis → Feasibility → Px4_SuT → Autonomous_Navigation → missions.json (Sample of Expected Mission Input by Px4)
- Go to SuT → px4 → airsim-setting-sample.json (Sample of Airsim Settings to execute a Px4 Mission)
- Go to Analysis → Feasibility → Ardu_SuT → Autonomous_Navigation → missions.json (Sample of Expected Mission Input by Ardupilot)
- Go to Analysis → Feasibility → Ardu → Waypoint_Navigation → environment.json (Sample of Ardupilot-Sitl Simulator Input)
- Go to Analysis → Generalizability → S-Agent-Scenario.xlsx
Mission and Environment Config Files Generated by the Mission and Environment Agents Across the 5 Use-Cases:
- Go to Agent_output → mission_json (JSON Generated by M-Agent for 25 scenarios)
- Go to Agent_output → Environment_json (JSON Generated by Env-Agent for 25 scenarios)
Go to Agent_output → analytics_agent_output (Report Generated by Analytics Agent)
The implementation of the AutoSimTest Framework, including all agents and the communication between them, is available here for replication and further research and analysis: Go to AutoSimTestFramework
Follow these steps to set up and run the AutoSimTest Framework codebase.
- Install Conda.
- Ensure Python is installed and available in your system PATH.
- Clone the repository:

  git clone https://github.com/UAVLab-SLU/AutoSimTestFramework.git
  cd AutoSimTestFramework/AutoSimTestFramework

- Create and activate a virtual environment:

  conda create -n autosim_env python=3.8
  conda activate autosim_env

- Install dependencies:

  pip install -r requirements.txt

- Run the Flask backend (a quick reachability check is sketched after these steps):

  python app.py

- Run the Gradio frontend for communicating with the S-Agent, Env-Agent, and M-Agent:

  python frontend2.py

- Run the Gradio frontend for communicating with the Analytics-Agent:

  python Analytics_Agent_front_end.py
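As a quick sanity check that the backend is up before launching a frontend, the hedged sketch below sends a request to the Flask development server, assuming it binds Flask's default address (127.0.0.1:5000); adjust the URL if app.py uses a different host or port.

```python
# Minimal sketch: confirm the Flask backend started by app.py is reachable.
# Assumes Flask's default host/port (127.0.0.1:5000) -- an assumption, not
# something stated by the repository. Any HTTP response (even a 404) means
# the server process is up and listening.
import requests

BASE_URL = "http://127.0.0.1:5000"

try:
    resp = requests.get(BASE_URL, timeout=5)
    print(f"Backend reachable, HTTP status {resp.status_code}")
except requests.ConnectionError:
    print("Backend not reachable -- is `python app.py` running?")
```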