The AI Data Security Assistant is a generative AI-powered chatbot integrated into Thales File Activity Monitoring (FAM) that streamlines security investigations and compliance tasks. By understanding natural language queries, it enables security, compliance, and IT teams to quickly analyze file access activity, detect policy violations, and generate detailed audit reports without deep technical expertise.
Whether identifying anomalous behavior, tracing sensitive data access, or preparing for GDPR and HIPAA audits, the assistant dramatically reduces manual workload and enhances response speed—turning complex data trails into clear, actionable insights in seconds.
As a SOC engineer investigating access to sensitive data within the organization, I'll demonstrate how to leverage the generative AI data security assistant for this critical task.
The interface displays recent chat history alongside popular prompts, such as failed access attempts on encrypted files and file access events from external IP addresses. For this investigation, I'll begin with a fundamental query: "Which classification profiles were accessed?"
Classification profiles encompass various data protection standards such as HIPAA sensitive data, GDPR sensitive data, PCI DSS data, and numerous additional classification categories. Thales File Activity Monitoring automatically classifies sensitive information according to these established profiles.
The Data Security Assistant processes the query and returns seventeen classification profiles that users within the organization have accessed. To gain deeper insight into user activity patterns, I'll submit a follow-up query: "How many users accessed each classification profile?"
The system analyzes the request and constructs the appropriate database query. Results indicate that one or two users accessed sensitive files within each classification profile. The response includes both the summary data, showing unique user counts per classification profile, and query details revealing the exact database commands executed to retrieve these results.
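The demo does not expose the FAM schema or the exact commands, but conceptually the generated query is a grouped distinct-count over the file-activity log. The sketch below is purely illustrative: the table name, column names, and sample rows are hypothetical stand-ins, not the product's actual data model.

```python
# Illustrative sketch of the kind of aggregation the assistant generates.
# The table and columns (file_events, user_name, classification_profile) are
# hypothetical; the real FAM schema is not shown in the demo.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE file_events (
        user_name TEXT,
        classification_profile TEXT,
        file_path TEXT,
        action TEXT,
        event_time TEXT
    );
    INSERT INTO file_events VALUES
        ('System', 'HIPAA Sensitive Data', '/data/patients.csv', 'read', '2024-05-06T09:15:00'),
        ('N/A',    'HIPAA Sensitive Data', '/data/claims.xlsx',  'read', '2024-05-06T10:02:00'),
        ('alice',  'PCI DSS',              '/finance/cards.db',  'read', '2024-05-06T11:47:00');
""")

# Mirrors the follow-up question:
# "How many users accessed each classification profile?"
rows = conn.execute("""
    SELECT classification_profile,
           COUNT(DISTINCT user_name) AS unique_users
    FROM file_events
    GROUP BY classification_profile
    ORDER BY unique_users DESC;
""").fetchall()

for profile, users in rows:
    print(f"{profile}: {users} unique user(s)")
```

In the actual product, this generated query is surfaced alongside the summary results, so an analyst can verify exactly what was executed against the activity data.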
The interface provides suggested follow-up questions such as "Which classification profile was accessed by the most users?" along with options to generate reports or view results within the reporting platform.
Focusing on HIPAA data specifically, I'll query: "Which users accessed HIPAA sensitive data?" The results show that users identified as "N/A" and "System" accessed this protected information.
To investigate the System user's activity further, I'll ask: "Which files did System access?" This query returns seven files containing HIPAA sensitive data that the System user accessed.
The Data Security Assistant suggests several investigative paths: "What specific actions did System perform on the HIPAA sensitive data?", "How frequently does System access HIPAA sensitive data?", and "Are there any unusual patterns in System's access to HIPAA data?"
Selecting the frequency analysis reveals that System accessed HIPAA sensitive data 135 times during the specified period.
To establish ongoing monitoring, I'll create a recurring report by selecting the report generation option. The system opens a configuration dialog with an auto-generated report name and description. I can set the recurrence schedule—in this case, every Monday at midnight—and choose delivery options including data storage or email distribution in multiple formats such as PDF or CSV. The email configuration allows customization of templates, subject lines, and recipient lists to ensure stakeholders receive timely security intelligence.
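Under the hood, a recurring report of this kind amounts to a saved query plus a schedule and delivery settings. The following sketch uses hypothetical field names (the FAM report API itself is not shown in the demo) to illustrate the configuration described above, including the "every Monday at midnight" recurrence.

```python
# Hypothetical sketch of a recurring-report configuration like the one created
# in the demo: weekly on Monday at midnight, delivered by email as PDF or CSV.
# Field names and the scheduling helper are assumptions, not the FAM API.
from datetime import datetime, timedelta

report_config = {
    "name": "System access to HIPAA sensitive data",   # auto-generated name in the demo
    "query": "files accessed by user 'System' matching the HIPAA profile",
    "schedule": {"frequency": "weekly", "day": "Monday", "time": "00:00"},
    "delivery": {
        "method": "email",                              # or store the report data
        "formats": ["PDF", "CSV"],
        "recipients": ["soc-team@example.com"],         # placeholder recipient list
        "subject": "Weekly HIPAA access report",
    },
}

def next_monday_midnight(now: datetime) -> datetime:
    """Return the next Monday 00:00 after `now` (Monday is weekday 0)."""
    days_ahead = (0 - now.weekday()) % 7 or 7
    return (now + timedelta(days=days_ahead)).replace(
        hour=0, minute=0, second=0, microsecond=0
    )

print("Next scheduled run:", next_monday_midnight(datetime.now()))
```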