AWS Clean Rooms now supports configurable compute size for PySpark, giving customers the flexibility to allocate compute resources for PySpark jobs based on their performance, scale, and cost requirements. With this launch, customers can specify the instance type and cluster size at job runtime for each analysis that uses PySpark, the Python API for Apache Spark. For example, customers can use large instance configurations to achieve the performance needed for complex datasets and analyses, or smaller instances to optimize costs.
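For context, here is a minimal sketch of what specifying compute size at job runtime could look like through the AWS SDK for Python (boto3). `start_protected_job` is the Clean Rooms API for running PySpark analyses; the `computeConfiguration` shape shown here (a worker `type` and `number`) is an assumption modeled on the analogous parameter in `StartProtectedQuery`, and the membership and analysis-template identifiers are placeholders, so check the current SDK documentation before relying on the exact field names.

```python
# Sketch: start a Clean Rooms PySpark job with an explicit compute size.
# Assumptions are flagged inline; identifiers below are placeholders.
import boto3

cleanrooms = boto3.client("cleanrooms")

response = cleanrooms.start_protected_job(
    type="PYSPARK",
    membershipIdentifier="membership-id",  # placeholder membership ID
    jobParameters={
        # Placeholder ARN for an approved PySpark analysis template
        "analysisTemplateArn": (
            "arn:aws:cleanrooms:us-east-1:111122223333:"
            "membership/membership-id/analysistemplate/template-id"
        ),
    },
    # Assumed parameter shape: choose a larger worker type/count for
    # heavy analyses, or a smaller one to optimize cost, as the
    # announcement describes.
    computeConfiguration={
        "worker": {
            "type": "CR.4X",  # assumed instance-size value (e.g. CR.1X for smaller jobs)
            "number": 16,     # cluster size: number of workers
        }
    },
)

# The response is assumed to mirror StartProtectedQuery's, returning the
# job's identifier for status polling.
print(response["protectedJob"]["id"])
```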
AWS Clean Rooms helps companies and their partners easily analyze and collaborate on their collective datasets without revealing or copying one another’s underlying data. For more information about the AWS Regions where AWS Clean Rooms is available, see the AWS Regions table. To learn more about collaborating with AWS Clean Rooms, visit AWS Clean Rooms.
Categories: general:products/aws-clean-rooms,marketing:marchitecture/analytics
Source: Amazon Web Services
