Audit log proposal wip #4990

Closed
wants to merge 10 commits into from
220 changes: 220 additions & 0 deletions docs/proposals/proposal-audit-log.md
# Audit log

## Purpose of audit log

The audit log is a feature that allows users to track changes made to their organization, projects,
schemas, and other resources.

## Events to track

### OrganizationManager

- `leaveOrganization`
- `createOrganization`
- `deleteOrganization`
- `updatePlan`
- `updateRateLimits`
- `updateName`
- `deleteInvitation`
- `inviteByEmail`
- `joinOrganization`
- `requestOwnershipTransfer`
- `answerOwnershipTransferRequest`
- `deleteMember`
- `updateMemberAccess`
- `createMemberRole`
- `deleteMemberRole`
- `assignMemberRole`
- `updateMemberRole`
- `createRoleWithMembersMigration`
- `assignRoleToMembersMigration`

### ProjectManager

- `createProject`
- `deleteProject`
- `updateName`

### TargetManager

- `createTarget`
- `deleteTarget`
- `setTargetValidation`
- `updateTargetValidationSettings`
- `updateName`
- `updateTargetGraphQLEndpointUrl`
- `updateTargetSchemaComposition`

### SchemaPublisher

- `delete`

### SchemaManager

- `updateSchemaVersionStatus`
- `createVersion`
- `updateBaseSchema`
- `disableExternalSchemaComposition`
- `enableExternalSchemaComposition`
- `updateNativeSchemaComposition`
- `updateRegistryModel`
- `approveFailedSchemaCheck`

### BillingProvider

- `upgradeToPro`
- `syncOrganization`
- `downgradeToHobby`

### CdnProvider

- `createCDNAccessToken`
- `deleteCDNAccessToken`

### CollectionProvider

- `createCollection`
- `deleteCollection`
- `createOperation`
- `updateOperation`
- `updateCollection`
- `deleteOperation`

### GitHubIntegrationManager

- `register`
- `unregister`
- `enableProjectNameInGithubCheck`

### SlackIntegrationManager

- `register`
- `unregister`

### OIDCIntegrationsProvider

- `createOIDCIntegrationForOrganization`
- `updateOIDCIntegration`
- `deleteOIDCIntegration`

### Contracts

- `createContract`
- `disableContract`

### AlertsManager

- `addChannel`
- `deleteChannels`
- `addAlert`
- `deleteAlerts`
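
To keep the `eventKind`/`action` pairs consistent across the codebase, the event names above could be narrowed into a typed map instead of free-form strings. A minimal sketch, using names taken from the lists above (the type names `AuditLogActions` and `AuditLogEventRef` are illustrative, and only a subset of managers is shown):

```ts
// Sketch: constrain audit-log events to the manager/action pairs listed above.
// Only a few managers are shown; the full map would cover every section.
type AuditLogActions = {
  OrganizationManager:
    | 'leaveOrganization'
    | 'createOrganization'
    | 'deleteOrganization'
    | 'deleteMember'
    | 'updateMemberAccess';
  ProjectManager: 'createProject' | 'deleteProject' | 'updateName';
  TargetManager: 'createTarget' | 'deleteTarget' | 'updateName';
};

type AuditLogEventRef = {
  [K in keyof AuditLogActions]: { eventKind: K; action: AuditLogActions[K] };
}[keyof AuditLogActions];

// e.g. { eventKind: 'ProjectManager', action: 'createProject' } type-checks,
// while { eventKind: 'ProjectManager', action: 'deleteTarget' } does not.
```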

## Storage

- `updateUser`
- `updateOrganizationName`
- `updateOrganizationPlan`
- `updateOrganizationRateLimits`
- `createOrganizationInvitation`
- `deleteOrganizationInvitationByEmail`
- `createOrganizationTransferRequest`
- `answerOrganizationTransferRequest`
- `addOrganizationMemberViaInvitationCode`
- `deleteOrganizationMember`
- `updateOrganizationMemberAccess`
- `assignOrganizationMemberRole`
- `assignOrganizationMemberRoleToMany`
- `deleteOrganizationMemberRole`
- `updateProjectRegistryModel`
- `createVersion`
- `updateVersionStatus`
- `createActivity`
- `addSlackIntegration`
- `deleteSlackIntegration`
- `addGitHubIntegration`
- `deleteGitHubIntegration`
- `setSchemaPolicyForOrganization`
- `setSchemaPolicyForProject`
- `createDocumentCollection`
- `deleteDocumentCollection`
- `updateDocumentCollection`
- `createDocumentCollectionDocument`
- `deleteDocumentCollectionDocument`
- `updateDocumentCollectionDocument`
- `createSchemaCheck`


## Implementation

The proposed implementation is to use a single ClickHouse table to store all events. The table would
have the following columns:


```sql
CREATE TABLE audit_log (
  event_time DateTime DEFAULT now(),
  user_id UUID,
  organization_id UUID,
  project_id UUID,
  project_name String,
  target_id UUID,
  target_name String,
  schema_version_id UUID,
  event_kind String,
  event_action String,
  event_details JSON
) ENGINE = MergeTree()
ORDER BY event_time
TTL event_time + INTERVAL 3 MONTH;
```
Data in ClickHouse would be append-only: we would never delete rows manually, relying instead on the TTL to remove rows older than X months.
Of course, the interval could be made configurable if necessary.
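
Reading the log back (for example, to render an audit-log page per organization) would then be a time-range scan filtered by `organization_id`. A rough sketch, assuming the same `clickhouse` npm client used for writes below; the `getAuditLog` helper name and the parameterized-query call style are illustrative and simply mirror the write sketch:

```ts
import { ClickHouse } from 'clickhouse';

const clickhouse = new ClickHouse(/* ... */);

// Sketch: fetch the most recent audit-log rows for one organization,
// newest first, capped to a page size.
const getAuditLog = async (organizationId: string, limit = 100) => {
  const query = `
    SELECT event_time, user_id, event_kind, event_action, event_details
    FROM audit_log
    WHERE organization_id = ?
    ORDER BY event_time DESC
    LIMIT ?
  `;
  return clickhouse.query(query, [organizationId, limit]);
};
```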

The logging function itself would be a simple helper that inserts a row into the table:

```ts
import { ClickHouse } from 'clickhouse';

const clickhouse = new ClickHouse(/* ... */);

type AuditLogEvent = {
  userId: string;
  organizationId?: string;
  projectId?: string;
  projectName?: string;
  targetId?: string;
  targetName?: string;
  schemaVersionId?: string;
  action: string;
  details?: Record<string, any>;
  eventKind: string;
};

const logAuditEvent = async (event: AuditLogEvent) => {
  const {
    userId,
    organizationId,
    projectId,
    projectName,
    targetId,
    targetName,
    schemaVersionId,
    action,
    details,
    eventKind,
  } = event;
  // one positional value per column, in the same order as the column list
  const query = `
    INSERT INTO audit_log (user_id, organization_id, project_id, project_name, target_id, target_name, schema_version_id, event_kind, event_action, event_details)
    VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
  `;
  await clickhouse.query(query, [
    userId,
    organizationId,
    projectId,
    projectName,
    targetId,
    targetName,
    schemaVersionId,
    eventKind,
    action,
    JSON.stringify(details ?? {}),
  ]);
};

// sample usage
// do not await, to avoid blocking the calling code
logAuditEvent({
  userId: '690ae6ae-30e7-4e6c-8114-97e50e41aee5',
  organizationId: 'da2dbbf8-6c03-4abf-964d-8a2d949da5cb',
  eventKind: 'OrganizationManager',
  action: 'joinOrganization',
});
```

We would call this `logAuditEvent` function from every place in the codebase where one of the events
listed above occurs.
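
As an illustration of what a call site might look like, reusing `logAuditEvent` from the sketch above; the `deleteMember` signature and surrounding method body are hypothetical, only the shape of the logged event matters:

```ts
// Hypothetical call site inside OrganizationManager; only the
// logAuditEvent call reflects the proposal, the rest is illustrative.
class OrganizationManager {
  async deleteMember(input: { userId: string; organizationId: string; memberUserId: string }) {
    // ... perform the actual deletion first ...

    // fire-and-forget: do not await, so logging never blocks the mutation
    void logAuditEvent({
      userId: input.userId,
      organizationId: input.organizationId,
      eventKind: 'OrganizationManager',
      action: 'deleteMember',
      details: { memberUserId: input.memberUserId },
    });
  }
}
```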
75 changes: 75 additions & 0 deletions scripts/get-services-outline.ts
import * as fs from 'fs';
import * as path from 'path';
import { fileURLToPath } from 'url';
import ts from 'typescript';

function getAllFiles(dir: string, ext: string, files: string[] = []): string[] {
  const items = fs.readdirSync(dir);
  items.forEach(item => {
    const filePath = path.join(dir, item);
    if (fs.statSync(filePath).isDirectory()) {
      getAllFiles(filePath, ext, files);
    } else if (filePath.endsWith(ext)) {
      files.push(filePath);
    }
  });
  return files;
}

// Function to extract classes and methods from a TypeScript source file
function extractClassesAndMethods(fileName: string, sourceFile: ts.SourceFile) {
  const classes: { name: string; methods: string[] }[] = [];

  function visit(node: ts.Node) {
    if (ts.isClassDeclaration(node) && node.name) {
      const className = node.name.getText();
      const methods: string[] = [];
      node.members.forEach(member => {
        if (ts.isMethodDeclaration(member) && member.name) {
          methods.push(member.name.getText());
        }
      });
      classes.push({
        name: className,
        // keep only methods that mutate state; skip read-only accessors
        methods: methods.filter(method => {
          return (
            method.startsWith('get') === false &&
            method.startsWith('count') === false &&
            method.startsWith('has') === false &&
            method.startsWith('read') === false
          );
        }),
      });
    }
    ts.forEachChild(node, visit);
  }

  visit(sourceFile);
  return classes;
}

/**
 * Script to print out classes and methods from the TypeScript source files in tsDirectory
 */

const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
const tsDirectory = 'packages/services/api/src/modules/';
const projectDir = path.resolve(__dirname, '..', tsDirectory); // repo root is one level above scripts/
const files = getAllFiles(projectDir, '.ts');

const classesAndMethods: { file: string; classes: { name: string; methods: string[] }[] }[] = [];

files.forEach(file => {
  const sourceFile = ts.createSourceFile(
    file,
    fs.readFileSync(file, 'utf8'),
    ts.ScriptTarget.Latest,
    true,
  );
  const classes = extractClassesAndMethods(file, sourceFile);
  if (classes.length > 0) {
    classesAndMethods.push({ file, classes });
    console.log(JSON.stringify(classes, null, 2));
  }
});
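
For reference, the script could be run from the repository root with something like `npx tsx scripts/get-services-outline.ts` (assuming a TypeScript runner such as `tsx` is available); it prints each module's classes together with their non-read methods as JSON, which is presumably how the event lists at the top of this proposal were gathered.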