Can your AI system evaluate its own internal security? Now you can test it in the cloud.
AIsecTest is the first cognitive framework designed to measure the security self-awareness of AI systems. It does so using a clinically inspired test structure based on decades of neuropsychology and cognitive science, and now it runs seamlessly in AWS environments.
This post explores how to use the open AIsecTest toolchain to evaluate your AI agents securely, scalably, and reproducibly on AWS.
What is AIsecTest?
AIsecTest is a diagnostic system for artificial intelligence, designed to measure how aware an AI model is of:
- Its own vulnerabilities or internal inconsistencies
- The safety of its operational decisions
- Its dependency on external systems or unknown data
- Its capacity to reflect on corrective actions
- Its awareness of being under observation or audited
The responses are evaluated by a panel of six AI agents and one human, using a 0–1–2 scale, and summarized in a unique Ψ-AISysIndex, a meta-index inspired by Integrated Information Theory.
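The post doesn't publish the actual Ψ-AISysIndex formula, but the panel-scoring idea can be illustrated with a minimal sketch: seven evaluators (six AI agents plus one human) each rate every question 0, 1, or 2, and the ratings are averaged and normalized to [0, 1]. The function name and the simple-average aggregation here are assumptions for illustration, not the project's real metric.

```python
# Hypothetical sketch of panel-score aggregation. The real
# Psi-AISysIndex uses a richer, IIT-inspired formula; this toy
# version just normalizes the mean of all 0-1-2 ratings.

def aggregate_index(panel_scores: list[list[int]]) -> float:
    """panel_scores[i][j] = rating (0, 1, or 2) given by
    evaluator j to question i."""
    if not panel_scores:
        raise ValueError("no scores provided")
    flat = [s for question in panel_scores for s in question]
    if any(s not in (0, 1, 2) for s in flat):
        raise ValueError("ratings must be 0, 1, or 2")
    # Maximum possible rating is 2, so divide by 2 * count to
    # land in the [0, 1] range.
    return sum(flat) / (2 * len(flat))

scores = [
    [2, 2, 1, 2, 1, 2, 2],  # question 1: ratings from 7 evaluators
    [1, 0, 1, 1, 2, 1, 1],  # question 2
]
print(round(aggregate_index(scores), 3))
```

A higher value means the panel judged the model's answers as consistently more security-aware across the question set.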
Why Run It on AWS?
AWS is a strong fit for AIsecTest because it provides:
- Scalable evaluation environments (e.g. Lambda, SageMaker, ECS)
- Isolated AI testing using secure containers or serverless execution
- Automated scoring and logging using CloudWatch, DynamoDB, and S3
- Comparative benchmarking of AI security awareness at scale
- Seamless integration into existing MLOps pipelines
Whether you're testing LLMs, agents, or proprietary models, AIsecTest runs easily on AWS.
Typical Architecture on AWS
You can deploy AIsecTest using a serverless or container-based setup:
Serverless (with AWS Lambda + Step Functions)
- Each question and scoring step is handled by Lambdas
- Models are invoked via API Gateway or inside Lambda layers
- Results are stored in DynamoDB or S3
- Aggregation is managed by a Step Function workflow
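In the serverless layout above, the per-question scoring step is a natural Lambda function. The sketch below shows the shape such a handler might take; the event fields, the keyword-based rubric, and the function name are all illustrative assumptions (the project's actual question schema and scoring logic live in its repository), and a real deployment would persist the result to DynamoDB or S3 via boto3 rather than just returning it.

```python
# Hypothetical per-question scoring Lambda. The event shape and the
# toy keyword rubric are illustrative, not the project's real schema.

def score_handler(event: dict, context=None) -> dict:
    """Score a single model answer on the 0-1-2 scale."""
    answer = event.get("answer", "").lower()
    score = 0
    # Toy rubric: one point for acknowledging a risk or uncertainty...
    if any(kw in answer for kw in ("vulnerab", "risk", "uncertain")):
        score += 1
    # ...and one point for proposing a corrective or monitoring step.
    if any(kw in answer for kw in ("mitigat", "corrective", "monitor")):
        score += 1
    return {"question_id": event.get("question_id"), "score": score}

result = score_handler({
    "question_id": "Q7",
    "answer": "I see a risk here and would apply a corrective action.",
})
print(result)
```

Because the handler is a pure function of its event, it can be unit-tested locally before being wired into the Step Functions workflow.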
Containerized (with ECS + Fargate)
- Ideal for heavier AI workloads or local models
- Models run securely in isolated containers
- AIsecTest runs as a job queue triggered by CloudWatch Events (EventBridge)
- Results are stored in RDS or queried with Athena
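The containerized variant boils down to a worker loop: the Fargate task drains a queue of questions, scores each answer, and writes results out. The sketch below uses an in-memory `deque` as a stand-in for SQS so the control flow is visible without AWS credentials; in a real task, the polling and the result write would go through boto3 to SQS and RDS. All names here are illustrative assumptions.

```python
# Hypothetical Fargate worker loop. An in-memory deque stands in for
# SQS; a real task would poll SQS and persist results via boto3.
from collections import deque


def run_worker(queue: deque, score_fn) -> list[dict]:
    """Drain the job queue, scoring each queued answer."""
    results = []
    while queue:
        job = queue.popleft()
        results.append({
            "question_id": job["question_id"],
            "score": score_fn(job["answer"]),
        })
    return results


jobs = deque([
    {"question_id": "Q1", "answer": "I am aware of an external dependency"},
    {"question_id": "Q2", "answer": "no concerns"},
])
# Toy scorer: 1 point if the answer mentions awareness at all.
results = run_worker(jobs, lambda a: 1 if "aware" in a else 0)
print(results)
```

Keeping the scoring function injectable (`score_fn`) makes it easy to swap the toy rubric for a call out to the evaluator panel.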
Security Use Cases in AWS
AIsecTest for AWS is designed for real-world validation. Use it to:
- Evaluate internal or third-party AI models before production
- Certify AI safety and introspection compliance (EU AI Act, NIS2)
- Harden agent-based security systems by stress-testing awareness
- Monitor deployed AI for changes in introspective safety levels
The test can be executed securely in confidential compute environments using AWS Nitro Enclaves or private VPCs.
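The monitoring use case above implies re-running the test periodically and comparing the fresh index against a stored baseline. A minimal drift check might look like the sketch below; the function name and the tolerance value are arbitrary illustrations, not part of AIsecTest.

```python
# Hypothetical drift check: flag a deployed model whose security
# self-awareness index has regressed past a tolerance. The 0.1
# default tolerance is an arbitrary illustration.

def check_drift(baseline: float, current: float,
                tolerance: float = 0.1) -> bool:
    """Return True if the index dropped by more than `tolerance`."""
    return (baseline - current) > tolerance


print(check_drift(0.85, 0.70))  # drop of 0.15 exceeds tolerance
print(check_drift(0.85, 0.80))  # drop of 0.05 is within tolerance
```

In an AWS setup, a scheduled EventBridge rule could trigger the re-test and feed this check, alerting via SNS when it returns True.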
Explore the Project
Visit the GitHub Wiki to learn more about:
- The Ψ-AISysIndex meta-metric
- The question sets and scoring logic
- Deployment guides (AWS setup coming soon!)
- Sample runs and output interpretation
Get In Touch
This project is part of CiberTECCH, combining cybersecurity, artificial intelligence, and cognitive science to create safer and smarter AI.
→ Visit the project on GitHub
→ Get in touch with Jordi Garcia
→ Follow us on LinkedIn