AWS API calls indicating DataPipeline privilege escalation
Description
AlphaSOC detected an IAM policy change granting permissions for privilege
escalation through AWS Data Pipeline. This detection identifies policies
combining iam:PassRole, datapipeline:CreatePipeline,
datapipeline:PutPipelineDefinition, and datapipeline:ActivatePipeline. The
detection triggers on CreatePolicy, CreatePolicyVersion, PutUserPolicy,
PutGroupPolicy, PutRolePolicy, AttachUserPolicy, AttachRolePolicy,
AttachGroupPolicy events that introduce these permissions.
These permissions enable an attacker to create a pipeline, define activities
that execute arbitrary code, attach a privileged IAM role via iam:PassRole,
and activate the pipeline to run with elevated permissions. Pipeline activities
can execute shell commands on EC2 instances, run SQL queries on databases, or
perform tasks on EMR clusters using the passed role's permissions. Attackers can
craft pipeline definitions that access AWS resources or exfiltrate data. Since
Data Pipeline is designed for ETL workflows, malicious pipeline execution blends
with legitimate data processing activity.
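The dangerous combination described above can be expressed as a simple policy check. The sketch below flags parsed IAM policy documents that grant all four escalation actions; the matching logic (exact names plus bare service or global wildcards) is a simplified illustration, not the detection's actual implementation.

```python
# Sketch: flag IAM policy documents granting the Data Pipeline
# privilege-escalation combination. Simplified matching for
# illustration only.

ESCALATION_ACTIONS = {
    "iam:PassRole",
    "datapipeline:CreatePipeline",
    "datapipeline:PutPipelineDefinition",
    "datapipeline:ActivatePipeline",
}

def granted_actions(policy_document: dict) -> set:
    """Collect actions from the policy's Allow statements."""
    actions = set()
    statements = policy_document.get("Statement", [])
    if isinstance(statements, dict):  # a single statement may be a bare object
        statements = [statements]
    for stmt in statements:
        if stmt.get("Effect") != "Allow":
            continue
        acts = stmt.get("Action", [])
        actions.update([acts] if isinstance(acts, str) else acts)
    return actions

def grants_datapipeline_escalation(policy_document: dict) -> bool:
    """True when the policy covers every action in the escalation set."""
    acts = granted_actions(policy_document)

    def covered(needed: str) -> bool:
        service, _, _ = needed.partition(":")
        return needed in acts or "*" in acts or f"{service}:*" in acts

    return all(covered(a) for a in ESCALATION_ACTIONS)
```

Note that a policy can split the actions across multiple statements (or multiple attached policies), so a realistic detector has to aggregate permissions per principal rather than inspect a single statement.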
Impact
If exploited, the attacker can execute arbitrary code and data processing
workflows using the permissions of privileged Data Pipeline execution roles.
This enables access to data sources and targets across the environment,
credential theft from pipeline execution contexts, data exfiltration through
pipeline activities, and establishment of persistence via scheduled pipelines.
Data Pipeline roles often have broad permissions to read from multiple data
sources and write to data lakes.
Severity
| Severity | Condition |
|---|---|
| Low | AWS API calls indicating DataPipeline privilege escalation |
Investigation and Remediation
Check CloudTrail for the PutUserPolicy, PutGroupPolicy, PutRolePolicy, or
Attach\*Policy event that added Data Pipeline permissions with iam:PassRole.
Identify the target principal and review requestParameters to examine the
exact permissions granted. Verify whether these permissions are required for
legitimate data engineering work, as they should typically be restricted to data
engineering teams with specific ETL responsibilities.
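When reviewing requestParameters, the inline-policy events carry the full policy document. A minimal sketch of extracting the granted actions from such a record, assuming the CloudTrail field names (`requestParameters.policyDocument`) and that the document arrives as a JSON string, typically URL-encoded:

```python
# Sketch: pull the Allow-statement actions out of a CloudTrail
# PutUserPolicy / PutRolePolicy / PutGroupPolicy record. Assumes the
# policyDocument field is a JSON string, possibly URL-encoded
# (unquote is a no-op on plain JSON without %XX sequences).
import json
from urllib.parse import unquote

def actions_from_put_policy_event(event: dict) -> set:
    """Return the actions granted by an inline-policy CloudTrail event."""
    params = event.get("requestParameters", {})
    doc = json.loads(unquote(params.get("policyDocument", "{}")))
    actions = set()
    statements = doc.get("Statement", [])
    if isinstance(statements, dict):
        statements = [statements]
    for stmt in statements:
        if stmt.get("Effect") != "Allow":
            continue
        acts = stmt.get("Action", [])
        actions.update([acts] if isinstance(acts, str) else acts)
    return actions
```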
If unauthorized, immediately detach or delete the policy. Review CloudTrail for
CreatePipeline, PutPipelineDefinition, or ActivatePipeline actions
performed by the affected principal. Check AWS Data Pipeline for pipelines
created by the principal. Review pipeline definitions for suspicious activities
including shell commands, script execution, or data transfers to external
locations. Examine IAM roles passed to these pipelines. Deactivate and delete
suspicious pipelines. Review pipeline execution logs in CloudWatch to identify
accessed resources and processed data. Rotate credentials for the affected
principal and any Data Pipeline execution roles that may have been exploited.
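The pipeline-activity review above can be scripted against exported CloudTrail records. A minimal sketch, assuming standard CloudTrail field names (`eventName`, `userIdentity.arn`) and a placeholder principal ARN:

```python
# Sketch: filter CloudTrail records for Data Pipeline actions performed
# by the affected principal. The principal ARN passed in is a
# placeholder for the one identified during investigation.

PIPELINE_ACTIONS = {"CreatePipeline", "PutPipelineDefinition", "ActivatePipeline"}

def pipeline_events_by_principal(records: list, principal_arn: str) -> list:
    """Return records where the principal invoked a pipeline action."""
    return [
        r for r in records
        if r.get("eventName") in PIPELINE_ACTIONS
        and r.get("userIdentity", {}).get("arn") == principal_arn
    ]
```

Any hits here warrant pulling the corresponding pipeline definitions and the `iam:PassRole` target roles before deactivating the pipelines.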
Known False Positives
- Authorized data engineers provisioning Data Pipeline permissions for ETL workflow automation
- Infrastructure-as-code deployments creating service roles for data integration frameworks
- Data platform teams setting up orchestration for data lake ingestion