diff --git a/404.html b/404.html index d0c1b1bb0..7b90e7c24 100644 --- a/404.html +++ b/404.html @@ -6,13 +6,13 @@ Page Not Found | Booster Framework - +

Page Not Found

We could not find what you were looking for.

Please contact the owner of the site that linked you to the original URL and let them know their link is broken.

- + \ No newline at end of file diff --git a/Home/index.html b/Home/index.html index 926c9e411..c147499ac 100644 --- a/Home/index.html +++ b/Home/index.html @@ -6,13 +6,13 @@ Booster Framework - +
Booster Logo

Build serverless event-sourcing microservices in minutes instead of months!

Booster is an open-source, minimalistic TypeScript framework to build event-sourced services with the minimal amount of code possible, but don't let its innocent appearance fool you; Booster analyzes the semantics of your code, sets up the optimal infrastructure to run your application at scale, and even generates a fully-working GraphQL API for you – you don't even need to write the resolvers or maintain your GraphQL schema, it will do that for you too.

And have we mentioned it's all open-source and free? Not free as in a few build minutes per month; we mean really free. Everything remains between you, your CI/CD scripts (wherever you want to put them), and your own cloud accounts. Nothing is hidden under the carpet: you can visit the GitHub repository and see every single detail.

- + \ No newline at end of file diff --git a/architecture/command/index.html b/architecture/command/index.html index 7a86e2898..258d4e287 100644 --- a/architecture/command/index.html +++ b/architecture/command/index.html @@ -6,13 +6,13 @@ Command | Booster Framework - +

Command

Commands are any action a user performs on your application. For example, RemoveItemFromCart, RatePhoto or AddCommentToPost. They express the intention of a user, and they are the main interaction mechanism of your application. They are similar to the concept of a request in a REST API. Command issuers can also send data on a command as parameters.

Creating a command

The Booster CLI will help you to create new commands. You just need to run the following command and the CLI will generate all the boilerplate for you:

boost new:command CreateProduct --fields sku:SKU displayName:string description:string price:Money

This will generate a new file called create-product.ts in the src/commands directory. You can also create the file manually, but you will need to create the class and decorate it, so we recommend using the CLI.
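The generated class will look roughly like the following sketch. The SKU and Money types come from the --fields flags above and are assumed here to be your own types living under src/common; the authorization policy and the handler body are left for you to fill in.

src/commands/create-product.ts
import { Command } from '@boostercloud/framework-core'
import { Register } from '@boostercloud/framework-types'
import { SKU } from '../common/sku'
import { Money } from '../common/money'

@Command({
  authorize: 'all', // adjust this to your own authorization policy
})
export class CreateProduct {
  public constructor(
    readonly sku: SKU,
    readonly displayName: string,
    readonly description: string,
    readonly price: Money
  ) {}

  public static async handle(command: CreateProduct, register: Register): Promise<void> {
    register.events(/* register your domain events here */)
  }
}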

Declaring a command

In Booster you define commands as TypeScript classes decorated with the @Command decorator. The command parameters are declared as properties of the class.

src/commands/command-name.ts
@Command()
export class CommandName {
  public constructor(readonly fieldA: SomeType, readonly fieldB: SomeOtherType) {}
}

These commands are handled by Command Handlers, the same way a REST controller does with a request. To create a command handler for a specific command, you must declare a handle class function inside the corresponding command class. For example:

src/commands/command-name.ts
@Command()
export class CommandName {
  public constructor(readonly fieldA: SomeType, readonly fieldB: SomeOtherType) {}

  public static async handle(command: CommandName, register: Register): Promise<void> {
    // Validate inputs
    // Run domain logic
    // register.events([event1,...])
  }
}

Booster will then generate the GraphQL mutation for the corresponding command, along with the infrastructure to handle it. You only have to define the class and the handler function. Commands are part of the public API, so you can define authorization policies for them; you can read more about this in the Authorization section.

tip

We recommend using command handlers to validate input data before registering events into the event store, because events are immutable once they are stored.

The command handler function

Each command class must have a method called handle. This function is the command handler, and it will be called by the framework every time one instance of this command is submitted. Inside the handler you can run validations, return errors, query entities to make decisions, and register relevant domain events.
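Putting those responsibilities together, a command handler typically looks like the following sketch. The Product entity and the ProductRenamed event are illustrative and assumed to be defined elsewhere in your project:

@Command({ authorize: 'all' })
export class RenameProduct {
  public constructor(readonly productID: UUID, readonly newDisplayName: string) {}

  public static async handle(command: RenameProduct, register: Register): Promise<void> {
    // 1. Run validations and reject invalid input by throwing an error
    if (command.newDisplayName.trim().length === 0) {
      throw new Error('The new display name cannot be empty')
    }
    // 2. Query entities to make decisions (Product is an illustrative entity)
    const product = await Booster.entity(Product, command.productID)
    if (!product) {
      throw new Error(`Product ${command.productID} does not exist`)
    }
    // 3. Register the relevant domain events (ProductRenamed is illustrative)
    register.events(new ProductRenamed(command.productID, command.newDisplayName))
  }
}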

Registering events

Within the command handler execution, it is possible to register domain events. The command handler function receives the register argument, so within the handler, it is possible to call register.events(...) with a list of events.

src/commands/create-product.ts
@Command()
export class CreateProduct {
  public constructor(readonly sku: string, readonly price: number) {}

  public static async handle(command: CreateProduct, register: Register): Promise<void> {
    register.events(new ProductCreated(/*...*/))
  }
}

For more details about events and the register parameter, see the Events section.

Returning a value

The command handler function can return a value. This value will be the response of the GraphQL mutation. By default, the command handler function is expected to have void as its return type. Since GraphQL does not have a void type, the mutation returns true when the handler is called through the GraphQL API. This is because the GraphQL specification requires a response, and true is the most appropriate value to represent a successful execution with no return value.

If you want to return a value, you can change the return type of the handler function. For example, to return a string:

src/commands/create-product.ts
@Command()
export class CreateProduct {
  public constructor(readonly sku: string, readonly price: number) {}

  public static async handle(command: CreateProduct, register: Register): Promise<string> {
    register.events(new ProductCreated(/*...*/))
    return 'Product created!'
  }
}

Validating data

tip

Booster uses the typed nature of GraphQL to ensure that types are correct before reaching the handler, so you don't have to validate types.

Throw an error

A command will fail if there is an uncaught error during its handling. When a command fails, Booster will return a detailed error response with the message of the thrown error, which is useful for debugging. At the same time, Booster will never return an error stack trace to the client, so you don't have to worry about exposing internal implementation details.

One case where you might want to throw an error is when the command is invalid because it breaks a business rule, for example, when the price exceeds a given limit. In that case, you can throw an error in the handler. Booster will use the error's message as the response to make it descriptive. For example, given this command:

src/commands/create-product.ts
@Command()
export class CreateProduct {
  public constructor(readonly sku: string, readonly price: number) {}

  public static async handle(command: CreateProduct, register: Register): Promise<void> {
    const priceLimit = 10
    if (command.price >= priceLimit) {
      throw new Error(`price must be below ${priceLimit}, and it was ${command.price}`)
    }
  }
}

You'll get something like this response:

{
  "errors": [
    {
      "message": "price must be below 10, and it was 19.99",
      "path": ["CreateProduct"]
    }
  ]
}

Register error events

There could be situations in which you want to register an event representing an error. For example, when moving items with insufficient stock from one location to another:

src/commands/move-stock.ts
@Command()
export class MoveStock {
  public constructor(
    readonly productID: string,
    readonly origin: string,
    readonly destination: string,
    readonly quantity: number
  ) {}

  public static async handle(command: MoveStock, register: Register): Promise<void> {
    if (!command.enoughStock(command.productID, command.origin, command.quantity)) {
      register.events(new ErrorEvent(`There is not enough stock for ${command.productID} at ${command.origin}`))
    } else {
      register.events(new StockMoved(/*...*/))
    }
  }

  private enoughStock(productID: string, origin: string, quantity: number): boolean {
    /* ... */
  }
}

In this case, the command operation can still be completed. An event handler will take care of that ErrorEvent and proceed accordingly.

Reading entities

Command handlers are a good place to make decisions and, to make better decisions, you need information. The Booster.entity function allows you to inspect the application state. This function receives two arguments: the entity class to fetch and the entityID. Here is an example of fetching an entity called Stock:

src/commands/move-stock.ts
@Command()
export class MoveStock {
  public constructor(
    readonly productID: string,
    readonly origin: string,
    readonly destination: string,
    readonly quantity: number
  ) {}

  public static async handle(command: MoveStock, register: Register): Promise<void> {
    const stock = await Booster.entity(Stock, command.productID)
    if (!command.enoughStock(command.origin, command.quantity, stock)) {
      register.events(new ErrorEvent(`There is not enough stock for ${command.productID} at ${command.origin}`))
    }
  }

  private enoughStock(origin: string, quantity: number, stock?: Stock): boolean {
    const count = stock?.countByLocation[origin]
    return !!count && count >= quantity
  }
}

Authorizing a command

Commands are part of the public API of a Booster application, so you can define who is authorized to submit them. All commands are protected by default, which means that no one can submit them. In order to allow users to submit a command, you must explicitly authorize them. You can use the authorize field of the @Command decorator to specify the authorization rule.

src/commands/create-product.ts
@Command({
  authorize: 'all',
})
export class CreateProduct {
  public constructor(
    readonly sku: Sku,
    readonly displayName: string,
    readonly description: string,
    readonly price: number
  ) {}

  public static async handle(command: CreateProduct, register: Register): Promise<void> {
    register.events(/* YOUR EVENT HERE */)
  }
}

You can read more about this in the Authorization section.

Submitting a command

Booster commands are accessible to the outside world as GraphQL mutations. GraphQL fits very well with Booster's CQRS approach because it has two kinds of operations: Mutations and Queries. Mutations are actions that modify the server-side data, just like commands.

Booster automatically creates one mutation per command. The framework infers the mutation input type from the command fields. Given this CreateProduct command:

@Command({
  authorize: 'all',
})
export class CreateProduct {
  public constructor(
    readonly sku: Sku,
    readonly displayName: string,
    readonly description: string,
    readonly price: number
  ) {}

  public static async handle(command: CreateProduct, register: Register): Promise<void> {
    register.events(/* YOUR EVENT HERE */)
  }
}

Booster generates the following GraphQL mutation:

mutation CreateProduct($input: CreateProductInput!): Boolean

where the schema for CreateProductInput is

{
  sku: String
  displayName: String
  description: String
  price: Float
}
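From a client, submitting the command is a regular GraphQL request. The following is a minimal sketch using a plain HTTP POST; the endpoint URL is a placeholder for your deployed GraphQL endpoint, and any GraphQL client library would work just as well:

// Sketch: submitting the CreateProduct mutation over plain HTTP.
// The endpoint URL below is a placeholder, not a real address.
async function submitCreateProduct(): Promise<void> {
  const response = await fetch('https://<your-graphql-endpoint>/graphql', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      query: `
        mutation CreateProduct($input: CreateProductInput!) {
          CreateProduct(input: $input)
        }
      `,
      variables: {
        input: { sku: 'ABC-123', displayName: 'Pulse watch', description: 'A heart-rate watch', price: 99.99 },
      },
    }),
  })
  const result = await response.json()
  console.log(result) // e.g. { data: { CreateProduct: true } }
}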

Commands naming convention

Semantics are very important in Booster, as they play an essential role in designing a coherent system. Your application should reflect your domain concepts, and commands are not an exception. Although you can name commands in any way you want, we strongly recommend naming them with an imperative verb followed by the object being affected. If we were designing an e-commerce application, some commands would be:

  • CreateProduct
  • DeleteProduct
  • UpdateProduct
  • ChangeCartItems
  • ConfirmPayment
  • MoveStock
  • UpdateCartShippingAddress

Although you can place commands, and other Booster files, in any directory, we strongly recommend putting them in <project-root>/src/commands. Having all the commands in one place will help you understand your application's capabilities at a glance.

<project-root>
├── src
│   ├── commands <------ put them here
│   ├── common
│   ├── config
│   ├── entities
│   ├── events
│   ├── index.ts
│   └── read-models
+ \ No newline at end of file diff --git a/architecture/entity/index.html b/architecture/entity/index.html index 64fb2de1b..25b9983c4 100644 --- a/architecture/entity/index.html +++ b/architecture/entity/index.html @@ -6,13 +6,13 @@ Entity | Booster Framework - +

Entity

If events are the source of truth of your application, entities are the current state of your application. For example, if you have an application that allows users to create bank accounts, the events would be something like AccountCreated, MoneyDeposited, MoneyWithdrawn, etc. But the entities would be the BankAccount instances themselves, with their current balance, owner, etc.

Entities are created by reducing the whole event stream. Booster generates entities on the fly, so you don't have to worry about their creation. However, you must define them in order to instruct Booster how to generate them.

info

Under the hood, Booster stores snapshots of the entities in order to reduce the load on the event store. That way, Booster doesn't have to reduce the whole event stream whenever the current state of an entity is needed.

Creating entities

The Booster CLI will help you to create new entities. You just need to run the following command and the CLI will generate all the boilerplate for you:

boost new:entity Product --fields displayName:string description:string price:Money

This will generate a new file called product.ts in the src/entities directory. You can also create the file manually, but you will need to create the class and decorate it, so we recommend using the CLI.

Declaring an entity

To declare an entity in Booster, you must define a class decorated with the @Entity decorator. Inside of the class, you must define a constructor with all the fields you want to have in your entity.

src/entities/entity-name.ts
@Entity
export class EntityName {
  public constructor(readonly fieldA: SomeType, readonly fieldB: SomeOtherType /* as many fields as needed */) {}
}

The reduce function

In order to tell Booster how to reduce the events, you must define a static method decorated with the @Reduces decorator. This method will be called by the framework every time an event of the specified type is emitted. The reducer method must return a new entity instance with the current state of the entity.

src/entities/entity-name.ts
@Entity
export class EntityName {
  public constructor(readonly fieldA: SomeType, readonly fieldB: SomeOtherType /* as many fields as needed */) {}

  @Reduces(SomeEvent)
  public static reduceSomeEvent(event: SomeEvent, currentEntityState?: EntityName): EntityName {
    /* Return a new entity based on the current one */
  }
}

The reducer method receives two parameters:

  • event - The event object that triggered the reducer
  • currentEntity? - The current state of the entity instance that the event belongs to, if it exists. This parameter is optional and will be undefined if the entity doesn't exist yet (for example, when you process a ProductCreated event that will generate the first version of a Product entity).

Reducing multiple events

You can define as many reducer methods as you want, each one for a different event type. For example, if you have a Cart entity, you could define a reducer for ProductAdded events and another one for ProductRemoved events.

src/entities/cart.ts
@Entity
export class Cart {
  public constructor(readonly items: Array<CartItem>) {}

  @Reduces(ProductAdded)
  public static reduceProductAdded(event: ProductAdded, currentCart?: Cart): Cart {
    const newItems = addToCart(event.item, currentCart)
    return new Cart(newItems)
  }

  @Reduces(ProductRemoved)
  public static reduceProductRemoved(event: ProductRemoved, currentCart?: Cart): Cart {
    const newItems = removeFromCart(event.item, currentCart)
    return new Cart(newItems)
  }
}
tip

It's highly recommended to keep your reducer functions pure, which means that you should be able to produce the new entity version by just looking at the event and the current entity state. You should avoid calling third party services, reading or writing to a database, or changing any external state.
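The addToCart and removeFromCart helpers used in the Cart example above are not part of the framework; a pure implementation of them could look like this sketch (the CartItem shape with a productId field is an assumption made for the example):

// Hypothetical pure helpers for the Cart reducers: they compute the new items array
// only from the event payload and the current cart, without touching external state.
function addToCart(item: CartItem, currentCart?: Cart): Array<CartItem> {
  const currentItems = currentCart?.items ?? []
  return [...currentItems, item]
}

function removeFromCart(item: CartItem, currentCart?: Cart): Array<CartItem> {
  const currentItems = currentCart?.items ?? []
  return currentItems.filter((existing) => existing.productId !== item.productId)
}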

There could be a lot of events being reduced concurrently among many entities, but, for a specific entity instance, the events order is preserved. This means that while one event is being reduced, all other events of any kind that belong to the same entity instance will be waiting in a queue until the previous reducer has finished. This is how Booster guarantees that the entity state is consistent.

reducer process gif

Eventual Consistency

Additionally, due to the event-driven and asynchronous nature of Booster, your data might not be instantly updated. Booster will consume the commands, generate events, and eventually generate the entities. Most of the time this is not perceivable, but under huge loads, it could be noticed.

This property is called Eventual Consistency, and it is a trade-off to have high availability for extreme situations, where other systems might simply fail.

Entity ID

In order to identify each entity instance, you must define an id field on each entity. This field will be used by the framework to identify the entity instance. If the value of the id field matches the value returned by the entityID() method of an Event, the framework will consider that the event belongs to that entity instance.

src/entities/entity-name.ts
@Entity
export class EntityName {
  public constructor(
    readonly id: UUID,
    readonly fieldA: SomeType,
    readonly fieldB: SomeOtherType /* as many fields as needed */
  ) {}

  @Reduces(SomeEvent)
  public static reduceSomeEvent(event: SomeEvent, currentEntityState?: EntityName): EntityName {
    /* Return a new entity based on the current one */
  }
}
tip

We recommend using the UUID type for the id field. You can generate a new UUID value by calling the UUID.generate() method already provided by the framework.
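For instance, a command that creates a brand-new entity instance can generate the id itself and pass it along in the event. This is only a sketch; the ProductCreated event and its association with a Product entity are illustrative:

import { Command, Event } from '@boostercloud/framework-core'
import { Register, UUID } from '@boostercloud/framework-types'

@Event
export class ProductCreated {
  public constructor(readonly productID: UUID, readonly sku: string, readonly price: number) {}

  public entityID(): UUID {
    return this.productID // matches the id field of the Product entity
  }
}

@Command({ authorize: 'all' })
export class CreateProduct {
  public constructor(readonly sku: string, readonly price: number) {}

  public static async handle(command: CreateProduct, register: Register): Promise<void> {
    const productID = UUID.generate() // fresh id for the new Product entity instance
    register.events(new ProductCreated(productID, command.sku, command.price))
  }
}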

Entities naming convention

Entities are a representation of your application state at a specific moment in time, so name them as close to your domain objects as possible. Typical entity names are nouns that might appear when you think about your app. In an e-commerce application, some entities would be:

  • Cart
  • Product
  • UserProfile
  • Order
  • Address
  • PaymentMethod
  • Stock

Entities live within the entities directory of the project source: <project-root>/src/entities.

<project-root>
├── src
│ ├── commands
│ ├── common
│ ├── config
│ ├── entities <------ put them here
│ ├── events
│ ├── index.ts
│ └── read-models
+ \ No newline at end of file diff --git a/architecture/event-driven/index.html b/architecture/event-driven/index.html index 86912eda7..4bc2c3f06 100644 --- a/architecture/event-driven/index.html +++ b/architecture/event-driven/index.html @@ -6,13 +6,13 @@ Booster architecture | Booster Framework - +

Booster architecture

Booster is a highly opinionated framework that provides a complete toolset to build production-ready event-driven serverless applications.

Two patterns influence Booster's event-driven architecture: Command-Query Responsibility Segregation (CQRS) and Event Sourcing. They're complex techniques to implement from scratch with lower-level frameworks, but Booster makes them feel natural and very easy to use.

architecture

As you can see in the diagram, Booster applications consist of four main building blocks: Commands, Events, Entities, and Read Models. Commands and Read Models are the public interface of the application, while Events and Entities are private implementation details. With Booster, clients submit Commands, query the Read Models, or subscribe to them to receive real-time updates thanks to the out-of-the-box GraphQL API.

Booster applications are event-driven and event-sourced, so the source of truth is the whole history of events. When a client submits a command, Booster wakes up and handles it through Command Handlers. As part of the process, some Events may be registered as needed.

On the other side, the framework caches the current state by automatically reducing all the registered events into Entities. You can also react to events via Event Handlers, triggering side-effect actions in response to certain events. Finally, Entities are not directly exposed; they are transformed or projected into Read Models, which are exposed to the public.

In this chapter you'll walk through these concepts in detail.

+ \ No newline at end of file diff --git a/architecture/event-handler/index.html b/architecture/event-handler/index.html index e28bb1566..d4bcc69b9 100644 --- a/architecture/event-handler/index.html +++ b/architecture/event-handler/index.html @@ -6,13 +6,13 @@ Event handler | Booster Framework - +

Event handler

An event handler is a class that reacts to events. They are commonly used to trigger side effects when new events are registered. For instance, when a certain event is registered in the system, an event handler could send an email to the user.

Creating an event handler

The Booster CLI will help you to create new event handlers. You just need to run the following command and the CLI will generate all the boilerplate for you:

boost new:event-handler HandleAvailability --event StockMoved

This will generate a new file called handle-availability.ts in the src/event-handlers directory. You can also create the file manually, but you will need to create the class and decorate it, so we recommend using the CLI.

Declaring an event handler

In Booster, event handlers are classes decorated with the @EventHandler decorator. The parameter of the decorator is the event that the handler will react to. The logic to be triggered after an event is registered is defined in the handle method of the class. This handle function will receive the event that triggered the handler.

src/event-handlers/handle-availability.ts
@EventHandler(StockMoved)
export class HandleAvailability {
  public static async handle(event: StockMoved): Promise<void> {
    // Do something here
  }
}

Creating an event handler

Event handlers can be easily created using the Booster CLI command boost new:event-handler. There are two mandatory arguments: the event handler name, and the name of the event it will react to. For instance:

boost new:event-handler HandleAvailability --event StockMoved

Once the creation is completed, there will be a new file in the event handlers directory <project-root>/src/event-handlers/handle-availability.ts.

<project-root>
├── src
│ ├── commands
│ ├── common
│ ├── config
│ ├── entities
│ ├── events
│ ├── event-handlers <------ put them here
│ └── read-models

Registering events from an event handler

Event handlers can also register new events. This is useful when you want to trigger a new event after a certain condition is met. For example, if you want to send an email to the user when a product is out of stock.

In order to register new events, Booster injects the register instance into the handle method as a second parameter. This register instance has an events(...) method that allows you to store any side-effect events; you can pass as many as you need, separated by commas, as arguments of the function.

src/event-handlers/handle-availability.ts
@EventHandler(StockMoved)
export class HandleAvailability {
  public static async handle(event: StockMoved, register: Register): Promise<void> {
    if (event.quantity < 0) {
      register.events(new ProductOutOfStock(event.productID))
    }
  }
}

Reading entities from event handlers

There are cases where you need to read an entity to make a decision based on its current state. Different side effects can be triggered depending on the current state of the entity. Given the previous example, if a user does not want to receive emails when a product is out of stock, we should be able to check the user preferences before sending the email.

For that reason, Booster provides the Booster.entity function. This function allows you to retrieve the current state of an entity. Let's say that we want to check the status of a product before we trigger its availability update. In that case we would call the Booster.entity function, which will return information about the entity.

src/event-handlers/handle-availability.ts
@EventHandler(StockMoved)
export class HandleAvailability {
public static async handle(event: StockMoved, register: Register): Promise<void> {
const product = await Booster.entity(Product, event.productID)
if (product.stock < 0) {
register.events([new ProductOutOfStock(event.productID)])
}
}
}
+ \ No newline at end of file diff --git a/architecture/event/index.html b/architecture/event/index.html index 0594a8514..a30c77cf4 100644 --- a/architecture/event/index.html +++ b/architecture/event/index.html @@ -6,13 +6,13 @@ Event | Booster Framework - +

Event

An event is a fact of something that has happened in your application. Every action that takes place in your application should be stored as an event. Events are stored in a single collection, forming a set of immutable records that contain the whole story of your application. This collection of events is commonly known as the Event Store.

Creating an event

The Booster CLI will help you to create new events. You just need to run the following command and the CLI will generate all the boilerplate for you:

boost new:event StockMoved --fields productID:string origin:string destination:string quantity:number

This will generate a new file called stock-moved.ts in the src/events directory. You can also create the file manually, but you will need to create the class and decorate it, so we recommend using the CLI.

Declaring an event

Events are the cornerstone of Booster because of its event-driven and event-sourced nature. Booster events are TypeScript classes decorated with @Event. An event class may look like this:

src/events/event-name.ts
@Event
export class EventName {
  public constructor(readonly field1: SomeType, readonly field2: SomeOtherType) {}
}

The class name is the name of the event, which is used to identify it in the application and to generate the GraphQL schema. The constructor parameters define the names and types of the event's fields.

Events and entities

Events and Entities are closely related. Each event will be aggregated (or reduced) into an entity. Therefore, Booster needs a way to know which entity is associated with each event. For that reason, it is required to provide an entity ID with each event. You can declare it with a class function named entityID. For example:

src/events/cart-paid.ts
@Event
export class CartPaid {
  public constructor(readonly cartID: UUID, readonly paymentID: UUID) {}

  public entityID(): UUID {
    // returns cartID because we want to associate it with
    // (and reduce it within) the Cart entity
    return this.cartID
  }
}
tip

If your domain requires a Singleton entity, where there's only one instance of that entity in your whole application, you can return a constant value.
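For example, an event that should always be reduced into the same singleton entity instance can return a fixed id. The MaintenanceModeToggled event, the SystemSettings entity it would reduce into, and the constant id are purely illustrative:

@Event
export class MaintenanceModeToggled {
  public constructor(readonly enabled: boolean) {}

  public entityID(): UUID {
    // A constant id means every MaintenanceModeToggled event is reduced into the
    // same (singleton) SystemSettings entity instance.
    return 'system-settings'
  }
}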

caution

Make sure that the entityID method always returns the same value for the same event's instance. Otherwise, the result of the entity reduction will be unpredictable.

Registering events in the event store

We have shown you how to declare an event in Booster, but we haven't explained how to store them in the event store. In Booster terminology, creating an instance of an event and storing it in the event store is known as registering it. You can do that in Booster using the register.events(...) function. The register object is provided as a parameter in the handle method of both commands and event handlers. For example:

Registering events from command handlers

src/commands/move-stock.ts
@Command({
  authorize: [Admin],
})
export class MoveStock {
  public constructor(
    readonly productID: string,
    readonly origin: string,
    readonly destination: string,
    readonly quantity: number
  ) {}

  public static async handle(command: MoveStock, register: Register): Promise<void> {
    if (!command.enoughStock(command.origin, command.quantity, command.productID)) {
      register.events(new ErrorEvent(`There is not enough stock for ${command.productID} at ${command.origin}`))
    }
  }
}

Registering events from event handlers

src/event-handlers/stock-moved.ts
@EventHandler(StockMoved)
export class HandleAvailability {
  public static async handle(event: StockMoved, register: Register): Promise<void> {
    register.events(new ProductAvailabilityChanged(event.productID, event.quantity))
  }
}

Events naming convention

As with commands, you can name events in any way you want, depending on your application's domain. However, we recommend choosing short sentences written in past tense, because events are facts that have happened and can't be changed. Some event names would be:

  • ProductCreated
  • ProductUpdated
  • ProductDeleted
  • CartItemChanged
  • StockMoved

As with other Booster files, events have their own directory:

<project-root>
├── src
│ ├── commands
│ ├── common
│ ├── config
│ ├── entities
│ ├── events <------ put them here
│ ├── index.ts
│ └── read-models
+ \ No newline at end of file diff --git a/architecture/notifications/index.html b/architecture/notifications/index.html index 49b5e46c0..9eeb5f77f 100644 --- a/architecture/notifications/index.html +++ b/architecture/notifications/index.html @@ -6,13 +6,13 @@ Notifications | Booster Framework - +

Notifications

Notifications are an important concept in event-driven architecture, and they play a crucial role in informing interested parties about certain events that take place within an application.

Declaring a notification

In Booster, notifications are defined as classes decorated with the @Notification decorator. Here's a minimal example to illustrate this:

src/notifications/cart-abandoned.ts
import { Notification } from '@boostercloud/framework-core'

@Notification()
export class CartAbandoned {}

As you can see, to define a notification you simply need to import the @Notification decorator from the @boostercloud/framework-core library and use it to decorate a class. In this case, the class CartAbandoned represents a notification that informs interested parties that a cart has been abandoned.

Separating by topic

By default, all notifications in the application will be sent to the same topic called defaultTopic. To configure this, you can specify a different topic name in the @Notification decorator:

src/notifications/cart-abandoned-topic.ts
import { Notification } from '@boostercloud/framework-core'

@Notification({ topic: 'cart-abandoned' })
export class CartAbandoned {}

In this example, the CartAbandoned notification will be sent to the cart-abandoned topic, instead of the default topic.

Separating by partition key

By default, all the notifications in the application will share a partition key called default. This means that, by default, all the notifications in the application will be processed in order, which may limit throughput.

To change this, you can use the @partitionKey decorator to specify a field that will be used as a partition key for each notification:

src/notifications/cart-abandoned-partition-key.ts
import { Notification, partitionKey } from '@boostercloud/framework-core'

@Notification({ topic: 'cart-abandoned' })
export class CartAbandoned {
public constructor(@partitionKey readonly key: string) {}
}

In this example, each CartAbandoned notification will have its own partition key, which is specified in the constructor as the field key (you can name this field however you want). This allows notifications to be processed in parallel, making the system more performant.

Reacting to notifications

Just like events, notifications can be handled by event handlers in order to trigger other processes. Event handlers are responsible for listening to events and notifications, and then performing specific actions in response to them.
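
As a hedged sketch, assuming notifications are consumed with the same @EventHandler mechanism used for events (the handler class and the follow-up event are hypothetical), a handler reacting to the CartAbandoned notification could look like this:

import { EventHandler } from '@boostercloud/framework-core'
import { Register } from '@boostercloud/framework-types'
import { CartAbandoned } from '../notifications/cart-abandoned'

@EventHandler(CartAbandoned)
export class RemindAbandonedCart {
  public static async handle(notification: CartAbandoned, register: Register): Promise<void> {
    // Trigger whatever follow-up process you need, e.g. register a (hypothetical) domain event:
    // register.events(new AbandonedCartReminderScheduled(...))
  }
}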

In conclusion, defining notifications in the Booster Framework is a simple and straightforward process that can be done using the @Notification and @partitionKey decorators.

+ \ No newline at end of file diff --git a/architecture/queries/index.html b/architecture/queries/index.html index 6fc9f3e05..29ad1f447 100644 --- a/architecture/queries/index.html +++ b/architecture/queries/index.html @@ -6,13 +6,13 @@ Queries | Booster Framework - +
-

Queries

Read Models offer read operations over the data that results from reducing events. Queries, on the other hand, provide a way to perform custom read operations.

Queries are classes decorated with the @Query decorator that have a handle method.

import { Booster, NonExposed, Query } from '@boostercloud/framework-core'
import { QueryInfo, UUID } from '@boostercloud/framework-types'
import { Cart } from '../entities/cart'

@Query({
  authorize: 'all',
})
export class CartTotalQuantity {
  public constructor(readonly cartId: UUID, @NonExposed readonly multiply: number) {}

  public static async handle(query: CartTotalQuantity, queryInfo: QueryInfo): Promise<number> {
    const cart = await Booster.entity(Cart, query.cartId)
    if (!cart || !cart.cartItems || cart.cartItems.length === 0) {
      return 0
    }
    return cart.cartItems
      .map((cartItem) => cartItem.quantity)
      .reduce((accumulator, value) => accumulator + value, 0)
  }
}

Queries naming convention

We recommend using the Query suffix in your query names.

Although you can place your queries in any directory, we strongly recommend putting them in <project-root>/src/queries.

<project-root>
├── src
│   ├── commands
│   ├── common
│   ├── config
│   ├── entities
│   ├── read-models
│   ├── events
│   ├── queries <------ put them here
│   └── index.ts

Creating a query

The preferred way to create a query is by using the generator, e.g.

boost new:query ItemsInCountry --fields country:string

The generator will create a TypeScript class under the queries directory <project-root>/src/queries/items-in-country.ts.

Query classes can also be created by hand, and there are no restrictions. The structure of the data is totally open and can be as complex as you can manage in your query handlers.

The query handler function

Each query class must have a method called handle. This function is the query handler, and it will be called by the framework every time an instance of this query is submitted. Inside the handler you can run validations, return errors, and query entities to make decisions.

The handler function receives a QueryInfo object that lets you interact with the execution context (see the sketch after this list). It can be used for a variety of purposes, including:

  • Access the current signed in user, their roles and other claims included in their JWT token
  • Access the request context or alter the HTTP response headers
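
The following is a minimal, hedged sketch of reading the signed-in user from the QueryInfo object. The currentUser property name and the Cart entity's ownerUsername field are assumptions made for illustration and may not match your Booster version or domain model:

import { Booster, Query } from '@boostercloud/framework-core'
import { QueryInfo, UUID } from '@boostercloud/framework-types'
import { Cart } from '../entities/cart'

@Query({ authorize: 'all' })
export class CartOwnedByMeQuery {
  public constructor(readonly cartId: UUID) {}

  public static async handle(query: CartOwnedByMeQuery, queryInfo: QueryInfo): Promise<boolean> {
    // Assumption: QueryInfo exposes the decoded JWT claims under `currentUser`
    const username = queryInfo.currentUser?.username
    const cart = await Booster.entity(Cart, query.cartId)
    // Assumption: the Cart entity stores an `ownerUsername` field
    return !!cart && cart.ownerUsername === username
  }
}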

Validating data

Booster uses the typed nature of GraphQL to ensure that types are correct before reaching the handler, so you don't have to validate types.

Throw an error

There are still business rules to be checked before proceeding with a query. For example, a given number must fall within a certain range, or a string must match a regular expression. In those cases, it is enough to throw an error in the handler. Booster will use the error's message as the response to make it descriptive.
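
As a hedged sketch (the query name and the validation rule are hypothetical), a handler that rejects invalid input simply by throwing could look like this:

import { Query } from '@boostercloud/framework-core'
import { QueryInfo } from '@boostercloud/framework-types'

@Query({ authorize: 'all' })
export class ProductsByCodeQuery {
  public constructor(readonly code: string) {}

  public static async handle(query: ProductsByCodeQuery, queryInfo: QueryInfo): Promise<string> {
    // Reject codes that don't match the expected pattern; the error message becomes the GraphQL response
    if (!/^[A-Z]{3}-\d{4}$/.test(query.code)) {
      throw new Error(`Invalid product code: ${query.code}`)
    }
    return query.code
  }
}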

Registering events

Within the query handler execution, it is not possible to register domain events. If you need to register events, then use a Command. For more details about events and the register parameter, see the Events section.

Authorizing queries

You can define who is authorized to access your queries. The Booster authorization feature is covered in the auth section. So far, we have seen that you can make a query publicly accessible by authorizing 'all' to query it, or you can restrict it to specific roles by providing an array of roles like this: authorize: [Admin].
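
For instance, here is a hedged sketch restricting a query to a hypothetical Admin role (assumed to be defined elsewhere with the @Role decorator and exported from ../roles):

import { Query } from '@boostercloud/framework-core'
import { QueryInfo } from '@boostercloud/framework-types'
import { Admin } from '../roles'

@Query({
  authorize: [Admin],
})
export class SalesReportQuery {
  public constructor(readonly year: number) {}

  public static async handle(query: SalesReportQuery, queryInfo: QueryInfo): Promise<number> {
    // Only requests authenticated with the Admin role reach this handler
    return 0
  }
}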

Querying

For every query, Booster automatically creates the corresponding GraphQL query. For example, given this CartTotalQuantityQuery:

@Query({
  authorize: 'all',
})
export class CartTotalQuantityQuery {
  public constructor(readonly cartId: UUID) {}

  public static async handle(query: CartTotalQuantityQuery, queryInfo: QueryInfo): Promise<number> {
    const cart = await Booster.entity(Cart, query.cartId)
    if (!cart || !cart.cartItems || cart.cartItems.length === 0) {
      return 0
    }
    return cart.cartItems
      .map((cartItem) => cartItem.quantity)
      .reduce((accumulator, value) => accumulator + value, 0)
  }
}

You will get the following GraphQL query:

query CartTotalQuantityQuery($cartId: ID!): Float!
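
You could then call it from any GraphQL client like this (the cart ID is a placeholder):

query {
  CartTotalQuantityQuery(cartId: "demo-cart-id")
}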

[!NOTE] Query subscriptions are not supported yet

+ \ No newline at end of file diff --git a/architecture/read-model/index.html b/architecture/read-model/index.html index 0678cec37..03c09ca60 100644 --- a/architecture/read-model/index.html +++ b/architecture/read-model/index.html @@ -6,13 +6,13 @@ Read model | Booster Framework - +
-

Read model

A read model contains the data of your application that is exposed to the client through the GraphQL API. It's a projection of one or more entities, so you don't have to expose them directly to the client. Booster generates the GraphQL queries that allow you to fetch your read models.

In other words, Read Models are cached data optimized for read operations. They're updated reactively when Entities are updated after reducing events.

Creating a read model

The Booster CLI will help you to create new read models. You just need to run the following command and the CLI will generate all the boilerplate for you:

boost new:read-model CartReadModel --fields id:UUID cartItems:"Array<CartItem>" paid:boolean --projects Cart

This will generate a new file called cart-read-model.ts in the src/read-models directory. You can also create the file manually, but you will need to create the class and decorate it, so we recommend using the CLI.

Declaring a read model

In Booster, a read model is a class decorated with the @ReadModel decorator. The properties of the class are the fields of the read model. The following example shows a read model with two fields:

@ReadModel
export class ReadModelName {
public constructor(readonly fieldA: SomeType, readonly fieldB: SomeType /* as many fields as needed */) {}
}
info

The ReadModelName class name will be used as the read model name in the GraphQL schema. Also, the types on the constructor will be used to generate the GraphQL schema. For example, if you have a property of type Array<CartItem>, the GraphQL schema will know that it is an array of CartItem objects.

The projection function

The projection function is a static method decorated with the @Projects decorator. It is used to define how the read model is updated when an entity is modified. The projection function must return a new instance of the read model, and it receives two arguments:

  • entity: The entity that has been modified
  • current?: The current read model instance. If it's the first time the read model is created, this argument will be undefined

You must provide the @Projects decorator with an entity class and the join key. The join key is the name of the field in the entity that is used to match it with the read model's id field. In the example below, we are using the id field of the Cart entity to match it with the CartReadModel read model.

@ReadModel
export class CartReadModel {
public constructor(readonly id: UUID, readonly cartItems: Array<CartItem>, readonly paid: boolean) {}

@Projects(Cart, 'id')
public static projectCart(entity: Cart, currentCartReadModel?: CartReadModel): CartReadModel {
return new CartReadModel(entity.id, entity.cartItems, entity.paid)
}
}

Projecting multiple entities

You are able to project multiple entities into the same read model. For example, you can have a UserReadModel that projects both the User entity and the Post entity. In this case, the join key will be different for each entity:

@ReadModel
export class UserReadModel {
public constructor(readonly username: string /* ...(other interesting fields from users)... */) {}

@Projects(User, 'id')
public static projectUser(entity: User, current?: UserReadModel): ProjectionResult<UserReadModel> {
// Here we update the user fields
}

@Projects(Post, 'ownerId')
public static projectUserPost(entity: Post, current?: UserReadModel): ProjectionResult<UserReadModel> {
//Here we can adapt the read model to show specific user information related with the Post entity
}
}

Advanced join keys

There might be cases where you need to project an entity into a read model using a more complex join key. For that reason, Booster supports other types of join keys.

Array of entities

You can use an array of entities as a join key. For example, if you have a Group entity with an array of the users in that group (users: Array<UUID>), Booster will take each value in the array, find the read model with that id, and execute the projection function. The signature of the projection function is slightly different in this case: it receives the readModelID as the second argument (the id being projected from the array), and the current read model instance as the third argument, which will be undefined the first time the read model is created. You can use the following to update each UserReadModel accordingly:

  @Projects(Group, 'users')
  public static projectUserGroup(entity: Group, readModelID: UUID, current?: UserReadModel): ProjectionResult<UserReadModel> {
    // Here we can update the read models with group information
    // This logic will be executed for each read model id in the array
  }

Returning special values

Projections usually return a new instance of the read model. However, there are some special cases where you may want to return a different value.

Deleting read models

One of the most common cases is when you want to delete a read model. For example, if you have a UserReadModel that projects the User entity, you may want to delete the read model when the user is deleted. In this case you can return the ReadModelAction.Delete value:

@ReadModel
export class UserReadModel {
  public constructor(readonly username: string, /* ...(other interesting fields from users)... */) {}

  @Projects(User, 'id')
  public static projectUser(entity: User, current?: UserReadModel): ProjectionResult<UserReadModel> {
    if (current?.deleted) {
      return ReadModelAction.Delete
    }
    return new UserReadModel(...)
  }
}
info

Deleting a read model is a very expensive operation. It will trigger a write operation in the read model store. If you can, try to avoid deleting read models.

Keeping read models untouched

Another common case is when you want to keep the read model untouched. For example, if you have a UserReadModel that projects the User entity, you may want to keep the read model untouched when there are no relevant changes to it. In this case you can return the ReadModelAction.Nothing value:

@ReadModel
export class UserReadModel {
  public constructor(readonly username: string, /* ...(other interesting fields from users)... */) {}

  @Projects(User, 'id')
  public static projectUser(entity: User, current?: UserReadModel): ProjectionResult<UserReadModel> {
    if (!current?.modified) {
      return ReadModelAction.Nothing
    }
    return new UserReadModel(...)
  }
}
info

Keeping the read model untouched is highly recommended over returning a new instance of the read model with the same data. Not only does this prevent a new write operation in the database, making your application more efficient, it also prevents an unnecessary update from being dispatched to any GraphQL clients subscribed to that read model.

Nested queries and calculated values using getters

You can use TypeScript getters in your read models to allow nested queries and/or return calculated values. You can write arbitrary code in a getter, but you will typically query related read model objects or compute a value based on the current read model instance or context. This greatly improves the potential for customizing your read model responses.

Here's an example of a getter in the UserReadModel class that returns all PostReadModels that belong to a specific UserReadModel:

@ReadModel
export class UserReadModel {
  public constructor(readonly id: UUID, readonly name: string, private postIds: UUID[]) {}

  public get posts(): Promise<PostReadModel[]> {
    return Promise.all(
      this.postIds.map((postId) =>
        Booster.readModel(PostReadModel)
          .filter({ id: { eq: postId } })
          .search()
      )
    ).then((results) => results.flat())
  }

  @Projects(User, 'id')
  public static projectUser(entity: User, current?: UserReadModel): ProjectionResult<UserReadModel> {
    return new UserReadModel(entity.id, entity.name, entity.postIds)
  }
}

As you can see, the getter posts uses the Booster.readModel(PostReadModel) method and filters it by the ids of the posts saved in the postIds private property. This allows you to retrieve all the PostReadModels that belong to a specific UserReadModel and include them as part of the GraphQL response.

Also, you can see here a simple example of a getter called currentTime that returns the timestamp at the moment of the request:

public get currentTime(): Date {
return new Date()
}

With the getters in place, your GraphQL API will start exposing the getters as regular fields and you will be able to transparently read them as follows:

query {
user(id: "123") {
id
name
currentTime
posts {
id
title
content
}
}
}

And here is an example of the corresponding JSON response when this query is executed:

{
"data": {
"user": {
"id": "123",
"name": "John Doe",
"currentTime": "2022-09-20T18:30:00.000Z",
"posts": [
{
"id": "1",
"title": "My first post",
"content": "This is the content of my first post"
},
{
"id": "2",
"title": "My second post",
"content": "This is the content of my second post"
}
]
}
}
}

Notice that getters are not cached in the read models database, so they will be executed every time you include these fields in your queries. If access to nested queries is frequent or the responses are large, you could improve your API response performance by querying the read models separately and joining the results in the client application.

Authorizing a read model

Read models are part of the public API of a Booster application, so you can define who is authorized to query them. All read models are protected by default, which means that no one can query them. In order to allow users to query a read model, you must explicitly authorize them. You can use the authorize field of the @ReadModel decorator to specify the authorization rule.

src/read-model/product-read-model.ts
@ReadModel({
authorize: 'all',
})
export class ProductReadModel {
public constructor(public id: UUID, readonly name: string, readonly description: string, readonly price: number) {}

@Projects(Product, 'id')
public static projectProduct(entity: Product, current?: ProductReadModel): ProjectionResult<ProductReadModel> {
return new ProductReadModel(entity.id, entity.name, entity.description, entity.price)
}
}

You can read more about this on the Authorization section.

Querying a read model

Booster read models are accessible to the outside world through GraphQL queries. GraphQL fits very well with Booster's CQRS approach because it has two kinds of reading operations: Queries and Subscriptions. They are read-only operations that do not modify the state of the application. Booster uses them to fetch data from the read models.

Booster automatically creates the queries and subscriptions for each read model. You can use them to fetch the data from the read models. For example, given the following read model:

src/read-model/cart-read-model.ts
@ReadModel({
authorize: 'all',
})
export class CartReadModel {
public constructor(public id: UUID, readonly items: Array<CartItem>) {}

@Projects(Cart, 'id')
public static projectCart(entity: Cart, currentReadModel: CartReadModel): ProjectionResult<CartReadModel> {
return new CartReadModel(entity.id, entity.items)
}
}

You will get the following GraphQL query and subscriptions:

query CartReadModel(id: ID!): CartReadModel
subscription CartReadModel(id: ID!): CartReadModel
subscription CartReadModels(id: UUIDPropertyFilter!): CartReadModel
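
For instance, you could fetch a single cart with a query like the following (the cart ID is a placeholder, and the productId and quantity selections assume those fields exist on CartItem):

query {
  CartReadModel(id: "demo-cart-id") {
    id
    items {
      productId
      quantity
    }
  }
}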

For more information about queries and how to use them, please check the GraphQL API section.

Filtering a read model

Booster GraphQL API provides support for filtering Read Models on queries and subscriptions. To get more information about it go to the GraphQL API section.

Subscribing to a read model

Booster GraphQL API also provides support for real-time updates using subscriptions and a web-socket. To get more information about it go to the GraphQL API section.

Sorting Read Models

There are some cases when it's desirable to query your read models sorted by a particular field. An example could be a chat app where you want to fetch the messages of a channel sorted by the time they were sent. Booster provides a special decorator to tag a specific property as a sequence key for a read model:

src/read-model/message-read-model.ts
@ReadModel
export class MessageReadModel {
  public constructor(
    readonly id: UUID, // A channel ID
    @sequencedBy readonly timestamp: string,
    readonly contents: string
  ) {}

  @Projects(Message, 'id')
  public static projectMessage(
    entity: Message,
    currentReadModel: MessageReadModel
  ): ProjectionResult<MessageReadModel> {
    return new MessageReadModel(entity.id, entity.timestamp, entity.contents)
  }
}

Querying time sequences

Adding a sequence key to a read model changes the behavior of the singular query, which now accepts the sequence key as an optional parameter:

query MessageReadModel(id: ID!, timestamp: string): [MessageReadModel]

Using this query, when only the id is provided, you get an array of all the messages in the channel sorted by timestamp in ascending order (from older to newer). When you also provide a specific timestamp, you still get an array, but it will only contain the message sent at that exact moment.
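
For example, fetching a whole channel ordered by time could look like this (the channel ID is a placeholder):

query {
  MessageReadModel(id: "channel-42") {
    id
    timestamp
    contents
  }
}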

It is important to guarantee that the sequence key is unique for each message. This could be difficult to achieve if you are using a timestamp as the sequence key. Booster provides a utility function to generate unique timestamps that you can use in your read models: TimeKey.generate(). It generates a timestamp with a random UUID as a suffix to avoid collisions.

For more information about queries and how to use them, please check the GraphQL API section.

Read models naming convention

As previously mentioned, semantics play an important role in designing a coherent system, and your application should reflect your domain concepts. We recommend choosing a representative domain name and using the ReadModel suffix in your read model names.

Although you can place your read models in any directory, we strongly recommend putting them in <project-root>/src/read-models. Having all the read models in one place will help you to understand your application's capabilities at a glance.

<project-root>
├── src
│   ├── commands
│   ├── common
│   ├── config
│   ├── entities
│   ├── read-models <------ put them here
│   ├── events
│   └── index.ts
+ \ No newline at end of file diff --git a/assets/js/021264af.b64efc7e.js b/assets/js/021264af.b64efc7e.js new file mode 100644 index 000000000..29ae5bc50 --- /dev/null +++ b/assets/js/021264af.b64efc7e.js @@ -0,0 +1 @@ +"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[5033],{3905:(e,t,n)=>{n.d(t,{Zo:()=>d,kt:()=>h});var r=n(7294);function s(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function i(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var r=Object.getOwnPropertySymbols(e);t&&(r=r.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,r)}return n}function o(e){for(var t=1;t=0||(s[n]=e[n]);return s}(e,t);if(Object.getOwnPropertySymbols){var i=Object.getOwnPropertySymbols(e);for(r=0;r=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(s[n]=e[n])}return s}var l=r.createContext({}),c=function(e){var t=r.useContext(l),n=t;return e&&(n="function"==typeof e?e(t):o(o({},t),e)),n},d=function(e){var t=c(e.components);return r.createElement(l.Provider,{value:t},e.children)},p={inlineCode:"code",wrapper:function(e){var t=e.children;return r.createElement(r.Fragment,{},t)}},u=r.forwardRef((function(e,t){var n=e.components,s=e.mdxType,i=e.originalType,l=e.parentName,d=a(e,["components","mdxType","originalType","parentName"]),u=c(n),h=s,m=u["".concat(l,".").concat(h)]||u[h]||p[h]||i;return n?r.createElement(m,o(o({ref:t},d),{},{components:n})):r.createElement(m,o({ref:t},d))}));function h(e,t){var n=arguments,s=t&&t.mdxType;if("string"==typeof e||s){var i=n.length,o=new Array(i);o[0]=u;var a={};for(var l in t)hasOwnProperty.call(t,l)&&(a[l]=t[l]);a.originalType=e,a.mdxType="string"==typeof e?e:s,o[1]=a;for(var c=2;c{n.r(t),n.d(t,{assets:()=>l,contentTitle:()=>o,default:()=>p,frontMatter:()=>i,metadata:()=>a,toc:()=>c});var r=n(7462),s=(n(7294),n(3905));const i={},o="Advanced uses of the Register object",a={unversionedId:"going-deeper/register",id:"going-deeper/register",title:"Advanced uses of the Register object",description:"The Register object is a built-in object that is automatically injected by the framework into all command or event handlers to let users interact with the execution context. 
It can be used for a variety of purposes, including:",source:"@site/docs/10_going-deeper/register.mdx",sourceDirName:"10_going-deeper",slug:"/going-deeper/register",permalink:"/going-deeper/register",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/10_going-deeper/register.mdx",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1697652123,formattedLastUpdatedAt:"Oct 18, 2023",frontMatter:{},sidebar:"docs",previous:{title:"Environments",permalink:"/going-deeper/environment-configuration"},next:{title:"Configuring Infrastructure Providers",permalink:"/going-deeper/infrastructure-providers"}},l={},c=[{value:"Registering events",id:"registering-events",level:2},{value:"Manually flush the events",id:"manually-flush-the-events",level:2},{value:"Access the current signed in user",id:"access-the-current-signed-in-user",level:2},{value:"Command-specific features",id:"command-specific-features",level:2},{value:"Access the request context",id:"access-the-request-context",level:3},{value:"Alter the HTTP response headers",id:"alter-the-http-response-headers",level:3}],d={toc:c};function p(e){let{components:t,...n}=e;return(0,s.kt)("wrapper",(0,r.Z)({},d,n,{components:t,mdxType:"MDXLayout"}),(0,s.kt)("h1",{id:"advanced-uses-of-the-register-object"},"Advanced uses of the Register object"),(0,s.kt)("p",null,"The Register object is a built-in object that is automatically injected by the framework into all command or event handlers to let users interact with the execution context. It can be used for a variety of purposes, including:"),(0,s.kt)("ul",null,(0,s.kt)("li",{parentName:"ul"},"Registering events to be emitted at the end of the command or event handler"),(0,s.kt)("li",{parentName:"ul"},"Manually flush the events to be persisted synchronously to the event store"),(0,s.kt)("li",{parentName:"ul"},"Access the current signed in user, their roles and other claims included in their JWT token"),(0,s.kt)("li",{parentName:"ul"},"In a command: Access the request context or alter the HTTP response headers")),(0,s.kt)("h2",{id:"registering-events"},"Registering events"),(0,s.kt)("p",null,"When handling a command or event, you can use the Register object to register one or more events that will be emitted when the command or event handler is completed. Events are registered using the ",(0,s.kt)("inlineCode",{parentName:"p"},"register.events()")," method, which takes one or more events as arguments. For example:"),(0,s.kt)("pre",null,(0,s.kt)("code",{parentName:"pre",className:"language-typescript"},"public async handle(register: Register): Promise {\n // Do some work...\n register.events(new OrderConfirmed(this.orderID))\n // Do more work...\n}\n")),(0,s.kt)("p",null,"In this example, we're registering an OrderConfirmed event to be persisted to the event store when the handler finishes. You can also register multiple events by passing them as separate arguments to the register.events() method:"),(0,s.kt)("pre",null,(0,s.kt)("code",{parentName:"pre",className:"language-typescript"},"public async handle(register: Register): Promise {\n // Do some work...\n register.events(\n new OrderConfirmed(this.orderID),\n new OrderShipped(this.orderID)\n )\n // Do more work...\n}\n")),(0,s.kt)("p",null,"It's worth noting that events registered with ",(0,s.kt)("inlineCode",{parentName:"p"},"register.events()")," aren't immediately persisted to the event store. Instead, they're stored in memory until the command or event handler finishes executing. 
To force the events to be persisted immediately, you can call the ",(0,s.kt)("inlineCode",{parentName:"p"},"register.flush()")," method that is described in the next section."),(0,s.kt)("h2",{id:"manually-flush-the-events"},"Manually flush the events"),(0,s.kt)("p",null,"As mentioned in the previous section, events registered with ",(0,s.kt)("inlineCode",{parentName:"p"},"register.events()")," aren't immediately persisted to the event store. Instead, they're stored in memory until the command or event handler finishes its execution, but this doesn't work in all situations, sometimes it's useful to store partial updates of a longer process, and some scenarios could accept partial successes. To force the events to be persisted and wait for the database to confirm the write, you can use the ",(0,s.kt)("inlineCode",{parentName:"p"},"register.flush()")," method."),(0,s.kt)("p",null,"The ",(0,s.kt)("inlineCode",{parentName:"p"},"register.flush()")," method takes no arguments and returns a promise that resolves when the events have been successfully persisted to the event store. For example:"),(0,s.kt)("pre",null,(0,s.kt)("code",{parentName:"pre",className:"language-typescript"},"public async handle(register: Register): Promise {\n // Do some work...\n register.events(new OrderConfirmed(this.orderID))\n await register.flush()\n const mailID = await sendConfirmationEmail(this.orderID)\n register.events(new MailSent(this.orderID, mailID))\n // Do more work...\n}\n")),(0,s.kt)("p",null,"In this example, we're calling ",(0,s.kt)("inlineCode",{parentName:"p"},"register.flush()")," after registering an ",(0,s.kt)("inlineCode",{parentName:"p"},"OrderConfirmed")," event to ensure that it's persisted to the event store before continuing with the rest of the handler logic. In this way, even if an error happens while sending the confirmation email, the order will be persisted."),(0,s.kt)("h2",{id:"access-the-current-signed-in-user"},"Access the current signed in user"),(0,s.kt)("p",null,"When handling a command or event, you can use the injected ",(0,s.kt)("inlineCode",{parentName:"p"},"Register")," object to access the currently signed-in user as well as any metadata included in their JWT token like their roles or other claims (the specific claims will depend on the specific auth provider used). To do this, you can use the ",(0,s.kt)("inlineCode",{parentName:"p"},"currentUser")," property. 
This property is an instance of the ",(0,s.kt)("inlineCode",{parentName:"p"},"UserEnvelope")," class, which has the following properties:"),(0,s.kt)("pre",null,(0,s.kt)("code",{parentName:"pre",className:"language-typescript"},"export interface UserEnvelope {\n id?: string // An optional identifier of the user\n username: string // The unique username of the current user\n roles: Array // The list of role names assigned to this user\n claims: Record // An object containing the claims included in the body of the JWT token\n header?: Record // An object containing the headers of the JWT token for further verification\n}\n")),(0,s.kt)("p",null,"For example, to access the username of the currently signed-in user, you can use the ",(0,s.kt)("inlineCode",{parentName:"p"},"currentUser.username")," property:"),(0,s.kt)("pre",null,(0,s.kt)("code",{parentName:"pre",className:"language-typescript"},"public async handle(register: Register): Promise {\n console.log(`The currently signed-in user is ${register.currentUser?.username}`)\n}\n\n// Output: The currently signed-in user is john.doe\n")),(0,s.kt)("h2",{id:"command-specific-features"},"Command-specific features"),(0,s.kt)("p",null,"The command handlers are executed as part of a GraphQL mutation request, so they have access to a few additional features that are specific to commands that can be used to access the request context or alter the HTTP response headers."),(0,s.kt)("h3",{id:"access-the-request-context"},"Access the request context"),(0,s.kt)("p",null,"The request context is injected in the command handler as part of the register command and you can access it using the ",(0,s.kt)("inlineCode",{parentName:"p"},"context")," property. This property is an instance of the ",(0,s.kt)("inlineCode",{parentName:"p"},"ContextEnvelope")," interface, which has the following properties:"),(0,s.kt)("pre",null,(0,s.kt)("code",{parentName:"pre",className:"language-typescript"},"export interface ContextEnvelope {\n /** Decoded request header and body */\n request: {\n headers: unknown\n body: unknown\n }\n /** Provider-dependent raw request context object */\n rawContext: unknown\n}\n")),(0,s.kt)("p",null,"The ",(0,s.kt)("inlineCode",{parentName:"p"},"request")," property exposes a normalized version of the request headers and body that can be used regardless the provider. We recommend using this property instead of the ",(0,s.kt)("inlineCode",{parentName:"p"},"rawContext")," property, as it will be more portable across providers."),(0,s.kt)("p",null,"The ",(0,s.kt)("inlineCode",{parentName:"p"},"rawContext")," property exposes the full raw request context as it comes in the original request, so it will depend on the underlying provider used. For instance, in AWS, it will be ",(0,s.kt)("a",{parentName:"p",href:"https://docs.aws.amazon.com/lambda/latest/dg/nodejs-context.html"},"a lambda context object"),", while in Azure it will be ",(0,s.kt)("a",{parentName:"p",href:"https://docs.microsoft.com/en-us/azure/azure-functions/functions-reference-node#context-object"},"an Azure Functions context object"),"."),(0,s.kt)("h3",{id:"alter-the-http-response-headers"},"Alter the HTTP response headers"),(0,s.kt)("p",null,"Finally, you can use the ",(0,s.kt)("inlineCode",{parentName:"p"},"responseHeaders")," property to alter the HTTP response headers that will be sent back to the client. This property is a plain Typescript object which is initialized with the default headers. 
You can add, remove or modify any of the headers by using the standard object methods:"),(0,s.kt)("pre",null,(0,s.kt)("code",{parentName:"pre",className:"language-typescript"},"public async handle(register: Register): Promise {\n register.responseHeaders['X-My-Header'] = 'My custom header'\n register.responseHeaders['X-My-Other-Header'] = 'My other custom header'\n delete register.responseHeaders['X-My-Other-Header']\n}\n")))}p.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/021264af.bdd37aaf.js b/assets/js/021264af.bdd37aaf.js deleted file mode 100644 index 9f45110ec..000000000 --- a/assets/js/021264af.bdd37aaf.js +++ /dev/null @@ -1 +0,0 @@ -"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[5033],{3905:(e,t,n)=>{n.d(t,{Zo:()=>d,kt:()=>h});var r=n(7294);function s(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function i(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var r=Object.getOwnPropertySymbols(e);t&&(r=r.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,r)}return n}function o(e){for(var t=1;t=0||(s[n]=e[n]);return s}(e,t);if(Object.getOwnPropertySymbols){var i=Object.getOwnPropertySymbols(e);for(r=0;r=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(s[n]=e[n])}return s}var l=r.createContext({}),c=function(e){var t=r.useContext(l),n=t;return e&&(n="function"==typeof e?e(t):o(o({},t),e)),n},d=function(e){var t=c(e.components);return r.createElement(l.Provider,{value:t},e.children)},p={inlineCode:"code",wrapper:function(e){var t=e.children;return r.createElement(r.Fragment,{},t)}},u=r.forwardRef((function(e,t){var n=e.components,s=e.mdxType,i=e.originalType,l=e.parentName,d=a(e,["components","mdxType","originalType","parentName"]),u=c(n),h=s,m=u["".concat(l,".").concat(h)]||u[h]||p[h]||i;return n?r.createElement(m,o(o({ref:t},d),{},{components:n})):r.createElement(m,o({ref:t},d))}));function h(e,t){var n=arguments,s=t&&t.mdxType;if("string"==typeof e||s){var i=n.length,o=new Array(i);o[0]=u;var a={};for(var l in t)hasOwnProperty.call(t,l)&&(a[l]=t[l]);a.originalType=e,a.mdxType="string"==typeof e?e:s,o[1]=a;for(var c=2;c{n.r(t),n.d(t,{assets:()=>l,contentTitle:()=>o,default:()=>p,frontMatter:()=>i,metadata:()=>a,toc:()=>c});var r=n(7462),s=(n(7294),n(3905));const i={},o="Advanced uses of the Register object",a={unversionedId:"going-deeper/register",id:"going-deeper/register",title:"Advanced uses of the Register object",description:"The Register object is a built-in object that is automatically injected by the framework into all command or event handlers to let users interact with the execution context. 
It can be used for a variety of purposes, including:",source:"@site/docs/10_going-deeper/register.mdx",sourceDirName:"10_going-deeper",slug:"/going-deeper/register",permalink:"/going-deeper/register",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/10_going-deeper/register.mdx",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1696768385,formattedLastUpdatedAt:"Oct 8, 2023",frontMatter:{},sidebar:"docs",previous:{title:"Environments",permalink:"/going-deeper/environment-configuration"},next:{title:"Configuring Infrastructure Providers",permalink:"/going-deeper/infrastructure-providers"}},l={},c=[{value:"Registering events",id:"registering-events",level:2},{value:"Manually flush the events",id:"manually-flush-the-events",level:2},{value:"Access the current signed in user",id:"access-the-current-signed-in-user",level:2},{value:"Command-specific features",id:"command-specific-features",level:2},{value:"Access the request context",id:"access-the-request-context",level:3},{value:"Alter the HTTP response headers",id:"alter-the-http-response-headers",level:3}],d={toc:c};function p(e){let{components:t,...n}=e;return(0,s.kt)("wrapper",(0,r.Z)({},d,n,{components:t,mdxType:"MDXLayout"}),(0,s.kt)("h1",{id:"advanced-uses-of-the-register-object"},"Advanced uses of the Register object"),(0,s.kt)("p",null,"The Register object is a built-in object that is automatically injected by the framework into all command or event handlers to let users interact with the execution context. It can be used for a variety of purposes, including:"),(0,s.kt)("ul",null,(0,s.kt)("li",{parentName:"ul"},"Registering events to be emitted at the end of the command or event handler"),(0,s.kt)("li",{parentName:"ul"},"Manually flush the events to be persisted synchronously to the event store"),(0,s.kt)("li",{parentName:"ul"},"Access the current signed in user, their roles and other claims included in their JWT token"),(0,s.kt)("li",{parentName:"ul"},"In a command: Access the request context or alter the HTTP response headers")),(0,s.kt)("h2",{id:"registering-events"},"Registering events"),(0,s.kt)("p",null,"When handling a command or event, you can use the Register object to register one or more events that will be emitted when the command or event handler is completed. Events are registered using the ",(0,s.kt)("inlineCode",{parentName:"p"},"register.events()")," method, which takes one or more events as arguments. For example:"),(0,s.kt)("pre",null,(0,s.kt)("code",{parentName:"pre",className:"language-typescript"},"public async handle(register: Register): Promise {\n // Do some work...\n register.events(new OrderConfirmed(this.orderID))\n // Do more work...\n}\n")),(0,s.kt)("p",null,"In this example, we're registering an OrderConfirmed event to be persisted to the event store when the handler finishes. You can also register multiple events by passing them as separate arguments to the register.events() method:"),(0,s.kt)("pre",null,(0,s.kt)("code",{parentName:"pre",className:"language-typescript"},"public async handle(register: Register): Promise {\n // Do some work...\n register.events(\n new OrderConfirmed(this.orderID),\n new OrderShipped(this.orderID)\n )\n // Do more work...\n}\n")),(0,s.kt)("p",null,"It's worth noting that events registered with ",(0,s.kt)("inlineCode",{parentName:"p"},"register.events()")," aren't immediately persisted to the event store. Instead, they're stored in memory until the command or event handler finishes executing. 
To force the events to be persisted immediately, you can call the ",(0,s.kt)("inlineCode",{parentName:"p"},"register.flush()")," method that is described in the next section."),(0,s.kt)("h2",{id:"manually-flush-the-events"},"Manually flush the events"),(0,s.kt)("p",null,"As mentioned in the previous section, events registered with ",(0,s.kt)("inlineCode",{parentName:"p"},"register.events()")," aren't immediately persisted to the event store. Instead, they're stored in memory until the command or event handler finishes its execution, but this doesn't work in all situations, sometimes it's useful to store partial updates of a longer process, and some scenarios could accept partial successes. To force the events to be persisted and wait for the database to confirm the write, you can use the ",(0,s.kt)("inlineCode",{parentName:"p"},"register.flush()")," method."),(0,s.kt)("p",null,"The ",(0,s.kt)("inlineCode",{parentName:"p"},"register.flush()")," method takes no arguments and returns a promise that resolves when the events have been successfully persisted to the event store. For example:"),(0,s.kt)("pre",null,(0,s.kt)("code",{parentName:"pre",className:"language-typescript"},"public async handle(register: Register): Promise {\n // Do some work...\n register.events(new OrderConfirmed(this.orderID))\n await register.flush()\n const mailID = await sendConfirmationEmail(this.orderID)\n register.events(new MailSent(this.orderID, mailID))\n // Do more work...\n}\n")),(0,s.kt)("p",null,"In this example, we're calling ",(0,s.kt)("inlineCode",{parentName:"p"},"register.flush()")," after registering an ",(0,s.kt)("inlineCode",{parentName:"p"},"OrderConfirmed")," event to ensure that it's persisted to the event store before continuing with the rest of the handler logic. In this way, even if an error happens while sending the confirmation email, the order will be persisted."),(0,s.kt)("h2",{id:"access-the-current-signed-in-user"},"Access the current signed in user"),(0,s.kt)("p",null,"When handling a command or event, you can use the injected ",(0,s.kt)("inlineCode",{parentName:"p"},"Register")," object to access the currently signed-in user as well as any metadata included in their JWT token like their roles or other claims (the specific claims will depend on the specific auth provider used). To do this, you can use the ",(0,s.kt)("inlineCode",{parentName:"p"},"currentUser")," property. 
This property is an instance of the ",(0,s.kt)("inlineCode",{parentName:"p"},"UserEnvelope")," class, which has the following properties:"),(0,s.kt)("pre",null,(0,s.kt)("code",{parentName:"pre",className:"language-typescript"},"export interface UserEnvelope {\n id?: string // An optional identifier of the user\n username: string // The unique username of the current user\n roles: Array // The list of role names assigned to this user\n claims: Record // An object containing the claims included in the body of the JWT token\n header?: Record // An object containing the headers of the JWT token for further verification\n}\n")),(0,s.kt)("p",null,"For example, to access the username of the currently signed-in user, you can use the ",(0,s.kt)("inlineCode",{parentName:"p"},"currentUser.username")," property:"),(0,s.kt)("pre",null,(0,s.kt)("code",{parentName:"pre",className:"language-typescript"},"public async handle(register: Register): Promise {\n console.log(`The currently signed-in user is ${register.currentUser?.username}`)\n}\n\n// Output: The currently signed-in user is john.doe\n")),(0,s.kt)("h2",{id:"command-specific-features"},"Command-specific features"),(0,s.kt)("p",null,"The command handlers are executed as part of a GraphQL mutation request, so they have access to a few additional features that are specific to commands that can be used to access the request context or alter the HTTP response headers."),(0,s.kt)("h3",{id:"access-the-request-context"},"Access the request context"),(0,s.kt)("p",null,"The request context is injected in the command handler as part of the register command and you can access it using the ",(0,s.kt)("inlineCode",{parentName:"p"},"context")," property. This property is an instance of the ",(0,s.kt)("inlineCode",{parentName:"p"},"ContextEnvelope")," interface, which has the following properties:"),(0,s.kt)("pre",null,(0,s.kt)("code",{parentName:"pre",className:"language-typescript"},"export interface ContextEnvelope {\n /** Decoded request header and body */\n request: {\n headers: unknown\n body: unknown\n }\n /** Provider-dependent raw request context object */\n rawContext: unknown\n}\n")),(0,s.kt)("p",null,"The ",(0,s.kt)("inlineCode",{parentName:"p"},"request")," property exposes a normalized version of the request headers and body that can be used regardless the provider. We recommend using this property instead of the ",(0,s.kt)("inlineCode",{parentName:"p"},"rawContext")," property, as it will be more portable across providers."),(0,s.kt)("p",null,"The ",(0,s.kt)("inlineCode",{parentName:"p"},"rawContext")," property exposes the full raw request context as it comes in the original request, so it will depend on the underlying provider used. For instance, in AWS, it will be ",(0,s.kt)("a",{parentName:"p",href:"https://docs.aws.amazon.com/lambda/latest/dg/nodejs-context.html"},"a lambda context object"),", while in Azure it will be ",(0,s.kt)("a",{parentName:"p",href:"https://docs.microsoft.com/en-us/azure/azure-functions/functions-reference-node#context-object"},"an Azure Functions context object"),"."),(0,s.kt)("h3",{id:"alter-the-http-response-headers"},"Alter the HTTP response headers"),(0,s.kt)("p",null,"Finally, you can use the ",(0,s.kt)("inlineCode",{parentName:"p"},"responseHeaders")," property to alter the HTTP response headers that will be sent back to the client. This property is a plain Typescript object which is initialized with the default headers. 
You can add, remove or modify any of the headers by using the standard object methods:"),(0,s.kt)("pre",null,(0,s.kt)("code",{parentName:"pre",className:"language-typescript"},"public async handle(register: Register): Promise {\n register.responseHeaders['X-My-Header'] = 'My custom header'\n register.responseHeaders['X-My-Other-Header'] = 'My other custom header'\n delete register.responseHeaders['X-My-Other-Header']\n}\n")))}p.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/0350e44c.1248c8a1.js b/assets/js/0350e44c.1248c8a1.js deleted file mode 100644 index 0d87ea76c..000000000 --- a/assets/js/0350e44c.1248c8a1.js +++ /dev/null @@ -1 +0,0 @@ -"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[8946],{3905:(e,t,n)=>{n.d(t,{Zo:()=>c,kt:()=>u});var o=n(7294);function a(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function r(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var o=Object.getOwnPropertySymbols(e);t&&(o=o.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,o)}return n}function i(e){for(var t=1;t=0||(a[n]=e[n]);return a}(e,t);if(Object.getOwnPropertySymbols){var r=Object.getOwnPropertySymbols(e);for(o=0;o=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(a[n]=e[n])}return a}var l=o.createContext({}),p=function(e){var t=o.useContext(l),n=t;return e&&(n="function"==typeof e?e(t):i(i({},t),e)),n},c=function(e){var t=p(e.components);return o.createElement(l.Provider,{value:t},e.children)},d={inlineCode:"code",wrapper:function(e){var t=e.children;return o.createElement(o.Fragment,{},t)}},m=o.forwardRef((function(e,t){var n=e.components,a=e.mdxType,r=e.originalType,l=e.parentName,c=s(e,["components","mdxType","originalType","parentName"]),m=p(n),u=a,h=m["".concat(l,".").concat(u)]||m[u]||d[u]||r;return n?o.createElement(h,i(i({ref:t},c),{},{components:n})):o.createElement(h,i({ref:t},c))}));function u(e,t){var n=arguments,a=t&&t.mdxType;if("string"==typeof e||a){var r=n.length,i=new Array(r);i[0]=m;var s={};for(var l in t)hasOwnProperty.call(t,l)&&(s[l]=t[l]);s.originalType=e,s.mdxType="string"==typeof e?e:a,i[1]=s;for(var p=2;p{n.r(t),n.d(t,{assets:()=>l,contentTitle:()=>i,default:()=>d,frontMatter:()=>r,metadata:()=>s,toc:()=>p});var o=n(7462),a=(n(7294),n(3905));const r={},i="Testing",s={unversionedId:"going-deeper/testing",id:"going-deeper/testing",title:"Testing",description:"Booster applications are fully tested by default. This means that you can be sure that your application will work as expected. However, you can also write your own tests to check that your application behaves as you expect. 
In this section, we will leave some recommendations on how to test your Booster application.",source:"@site/docs/10_going-deeper/testing.md",sourceDirName:"10_going-deeper",slug:"/going-deeper/testing",permalink:"/going-deeper/testing",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/10_going-deeper/testing.md",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1696768385,formattedLastUpdatedAt:"Oct 8, 2023",frontMatter:{},sidebar:"docs",previous:{title:"Webhook Rocket",permalink:"/going-deeper/rockets/rocket-webhook"},next:{title:"Migrations",permalink:"/going-deeper/data-migrations"}},l={},p=[{value:"Testing Booster applications",id:"testing-booster-applications",level:2},{value:"Testing with sinon-chai",id:"testing-with-sinon-chai",level:3},{value:"Recommended files",id:"recommended-files",level:3},{value:"Framework integration tests",id:"framework-integration-tests",level:2}],c={toc:p};function d(e){let{components:t,...n}=e;return(0,a.kt)("wrapper",(0,o.Z)({},c,n,{components:t,mdxType:"MDXLayout"}),(0,a.kt)("h1",{id:"testing"},"Testing"),(0,a.kt)("p",null,"Booster applications are fully tested by default. This means that you can be sure that your application will work as expected. However, you can also write your own tests to check that your application behaves as you expect. In this section, we will leave some recommendations on how to test your Booster application."),(0,a.kt)("h2",{id:"testing-booster-applications"},"Testing Booster applications"),(0,a.kt)("p",null,"To properly test a Booster application, you should create a ",(0,a.kt)("inlineCode",{parentName:"p"},"test")," folder at the same level as the ",(0,a.kt)("inlineCode",{parentName:"p"},"src")," one. Apart from that, tests' names should have the ",(0,a.kt)("inlineCode",{parentName:"p"},".test.ts")," format."),(0,a.kt)("p",null,"When a Booster application is generated, you will have a script in a ",(0,a.kt)("inlineCode",{parentName:"p"},"package.json")," like this:"),(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-typescript"},'"scripts": {\n "test": "nyc --extension .ts mocha --forbid-only \\"test/**/*.test.ts\\""\n}\n')),(0,a.kt)("p",null,"The only thing that you should add to this line are the ",(0,a.kt)("inlineCode",{parentName:"p"},"AWS_SDK_LOAD_CONFIG=true")," and ",(0,a.kt)("inlineCode",{parentName:"p"},"BOOSTER_ENV=test")," environment variables, so the script will look like this:"),(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-typescript"},'"scripts": {\n "test": "AWS_SDK_LOAD_CONFIG=true BOOSTER_ENV=test nyc --extension .ts mocha --forbid-only \\"test/**/*.test.ts\\""\n}\n')),(0,a.kt)("h3",{id:"testing-with-sinon-chai"},"Testing with ",(0,a.kt)("inlineCode",{parentName:"h3"},"sinon-chai")),(0,a.kt)("p",null,"The ",(0,a.kt)("inlineCode",{parentName:"p"},"BoosterConfig")," can be accessed through the ",(0,a.kt)("inlineCode",{parentName:"p"},"Booster.config")," on any part of a Booster application. 
To properly mock it for your objective, we really recommend to use sinon ",(0,a.kt)("inlineCode",{parentName:"p"},"replace")," method, after configuring your ",(0,a.kt)("inlineCode",{parentName:"p"},"Booster.config")," as desired."),(0,a.kt)("p",null,'In the example below, we add 2 "empty" read-models, since we are iterating ',(0,a.kt)("inlineCode",{parentName:"p"},"Booster.config.readModels")," from a command handler:"),(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-typescript"},"// Test\nimport { replace } from 'sinon'\n\nconst config = new BoosterConfig('test')\nconfig.appName = 'testing-time'\nconfig.providerPackage = '@boostercloud/framework-provider-aws'\nconfig.readModels['WoW'] = {} as ReadModelMetadata\nconfig.readModels['Amazing'] = {} as ReadModelMetadata\nreplace(Booster, 'config', config)\n\nconst spyMyCall = spy(MyCommand, 'myCall')\nconst command = new MyCommand('1', true)\nconst register = new Register('request-id-1')\nconst registerSpy = spy(register, 'events')\nawait MyCommand.handle(command, register)\n\nexpect(spyMyCall).to.have.been.calledOnceWithExactly('WoW')\nexpect(spyMyCall).to.have.been.calledOnceWithExactly('Amazing')\nexpect(registerSpy).to.have.been.calledOnceWithExactly(new MyEvent('1', 'WoW'))\nexpect(registerSpy).to.have.been.calledOnceWithExactly(new MyEvent('1', 'Amazing'))\n")),(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-typescript"},"// Example code\npublic static async handle(command: MyCommand, register: Register): Promise {\n const readModels = Booster.config.readModels\n for (const readModelName in readModels) {\n myCall(readModelName)\n register.events(new MyEvent(command.ID, readModelName))\n }\n}\n")),(0,a.kt)("h3",{id:"recommended-files"},"Recommended files"),(0,a.kt)("p",null,"These are some files that might help you speed up your testing with Booster."),(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-typescript"},"// /test/expect.ts\nimport * as chai from 'chai'\n\nchai.use(require('sinon-chai'))\nchai.use(require('chai-as-promised'))\n\nexport const expect = chai.expect\n")),(0,a.kt)("p",null,"This ",(0,a.kt)("inlineCode",{parentName:"p"},"expect")," method will help you with some more additional methods like ",(0,a.kt)("inlineCode",{parentName:"p"},"expect().to.have.been.calledOnceWithExactly()")),(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-yaml"},"# /.mocharc.yml\ndiff: true\nrequire: 'ts-node/register'\nextension:\n - ts\npackage: './package.json'\nrecursive: true\nreporter: 'spec'\ntimeout: 5000\nfull-trace: true\nbail: true\n")),(0,a.kt)("h2",{id:"framework-integration-tests"},"Framework integration tests"),(0,a.kt)("p",null,"Booster framework integration tests package is used to test the Booster project itself, but it is also an example of how a Booster application could be tested. We encourage developers to have a look at our ",(0,a.kt)("a",{parentName:"p",href:"https://github.com/boostercloud/booster/tree/main/packages/framework-integration-tests"},"Booster project repository"),"."),(0,a.kt)("p",null,"Some integration tests highly depend on the provider chosen for the project, and the infrastructure is normally deployed in the cloud right before the tests run. 
Once tests are completed, the application is teared down."),(0,a.kt)("p",null,"There are several types of integration tests in this package:"),(0,a.kt)("ul",null,(0,a.kt)("li",{parentName:"ul"},"Tests to ensure that different packages integrate as expected with each other."),(0,a.kt)("li",{parentName:"ul"},"Tests to ensure that a Booster application behaves as expected when it is hit by a client (a GraphQL client)."),(0,a.kt)("li",{parentName:"ul"},"Tests to ensure that the application behaves in the same way no matter what provider is selected.")))}d.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/0350e44c.26b57478.js b/assets/js/0350e44c.26b57478.js new file mode 100644 index 000000000..b155338f4 --- /dev/null +++ b/assets/js/0350e44c.26b57478.js @@ -0,0 +1 @@ +"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[8946],{3905:(e,t,n)=>{n.d(t,{Zo:()=>c,kt:()=>u});var o=n(7294);function a(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function r(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var o=Object.getOwnPropertySymbols(e);t&&(o=o.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,o)}return n}function i(e){for(var t=1;t=0||(a[n]=e[n]);return a}(e,t);if(Object.getOwnPropertySymbols){var r=Object.getOwnPropertySymbols(e);for(o=0;o=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(a[n]=e[n])}return a}var l=o.createContext({}),p=function(e){var t=o.useContext(l),n=t;return e&&(n="function"==typeof e?e(t):i(i({},t),e)),n},c=function(e){var t=p(e.components);return o.createElement(l.Provider,{value:t},e.children)},d={inlineCode:"code",wrapper:function(e){var t=e.children;return o.createElement(o.Fragment,{},t)}},m=o.forwardRef((function(e,t){var n=e.components,a=e.mdxType,r=e.originalType,l=e.parentName,c=s(e,["components","mdxType","originalType","parentName"]),m=p(n),u=a,h=m["".concat(l,".").concat(u)]||m[u]||d[u]||r;return n?o.createElement(h,i(i({ref:t},c),{},{components:n})):o.createElement(h,i({ref:t},c))}));function u(e,t){var n=arguments,a=t&&t.mdxType;if("string"==typeof e||a){var r=n.length,i=new Array(r);i[0]=m;var s={};for(var l in t)hasOwnProperty.call(t,l)&&(s[l]=t[l]);s.originalType=e,s.mdxType="string"==typeof e?e:a,i[1]=s;for(var p=2;p{n.r(t),n.d(t,{assets:()=>l,contentTitle:()=>i,default:()=>d,frontMatter:()=>r,metadata:()=>s,toc:()=>p});var o=n(7462),a=(n(7294),n(3905));const r={},i="Testing",s={unversionedId:"going-deeper/testing",id:"going-deeper/testing",title:"Testing",description:"Booster applications are fully tested by default. This means that you can be sure that your application will work as expected. However, you can also write your own tests to check that your application behaves as you expect. 
In this section, we will leave some recommendations on how to test your Booster application.",source:"@site/docs/10_going-deeper/testing.md",sourceDirName:"10_going-deeper",slug:"/going-deeper/testing",permalink:"/going-deeper/testing",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/10_going-deeper/testing.md",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1697652123,formattedLastUpdatedAt:"Oct 18, 2023",frontMatter:{},sidebar:"docs",previous:{title:"Webhook Rocket",permalink:"/going-deeper/rockets/rocket-webhook"},next:{title:"Migrations",permalink:"/going-deeper/data-migrations"}},l={},p=[{value:"Testing Booster applications",id:"testing-booster-applications",level:2},{value:"Testing with sinon-chai",id:"testing-with-sinon-chai",level:3},{value:"Recommended files",id:"recommended-files",level:3},{value:"Framework integration tests",id:"framework-integration-tests",level:2}],c={toc:p};function d(e){let{components:t,...n}=e;return(0,a.kt)("wrapper",(0,o.Z)({},c,n,{components:t,mdxType:"MDXLayout"}),(0,a.kt)("h1",{id:"testing"},"Testing"),(0,a.kt)("p",null,"Booster applications are fully tested by default. This means that you can be sure that your application will work as expected. However, you can also write your own tests to check that your application behaves as you expect. In this section, we will leave some recommendations on how to test your Booster application."),(0,a.kt)("h2",{id:"testing-booster-applications"},"Testing Booster applications"),(0,a.kt)("p",null,"To properly test a Booster application, you should create a ",(0,a.kt)("inlineCode",{parentName:"p"},"test")," folder at the same level as the ",(0,a.kt)("inlineCode",{parentName:"p"},"src")," one. Apart from that, tests' names should have the ",(0,a.kt)("inlineCode",{parentName:"p"},".test.ts")," format."),(0,a.kt)("p",null,"When a Booster application is generated, you will have a script in a ",(0,a.kt)("inlineCode",{parentName:"p"},"package.json")," like this:"),(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-typescript"},'"scripts": {\n "test": "nyc --extension .ts mocha --forbid-only \\"test/**/*.test.ts\\""\n}\n')),(0,a.kt)("p",null,"The only thing that you should add to this line are the ",(0,a.kt)("inlineCode",{parentName:"p"},"AWS_SDK_LOAD_CONFIG=true")," and ",(0,a.kt)("inlineCode",{parentName:"p"},"BOOSTER_ENV=test")," environment variables, so the script will look like this:"),(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-typescript"},'"scripts": {\n "test": "AWS_SDK_LOAD_CONFIG=true BOOSTER_ENV=test nyc --extension .ts mocha --forbid-only \\"test/**/*.test.ts\\""\n}\n')),(0,a.kt)("h3",{id:"testing-with-sinon-chai"},"Testing with ",(0,a.kt)("inlineCode",{parentName:"h3"},"sinon-chai")),(0,a.kt)("p",null,"The ",(0,a.kt)("inlineCode",{parentName:"p"},"BoosterConfig")," can be accessed through the ",(0,a.kt)("inlineCode",{parentName:"p"},"Booster.config")," on any part of a Booster application. 
To properly mock it for your objective, we really recommend to use sinon ",(0,a.kt)("inlineCode",{parentName:"p"},"replace")," method, after configuring your ",(0,a.kt)("inlineCode",{parentName:"p"},"Booster.config")," as desired."),(0,a.kt)("p",null,'In the example below, we add 2 "empty" read-models, since we are iterating ',(0,a.kt)("inlineCode",{parentName:"p"},"Booster.config.readModels")," from a command handler:"),(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-typescript"},"// Test\nimport { replace } from 'sinon'\n\nconst config = new BoosterConfig('test')\nconfig.appName = 'testing-time'\nconfig.providerPackage = '@boostercloud/framework-provider-aws'\nconfig.readModels['WoW'] = {} as ReadModelMetadata\nconfig.readModels['Amazing'] = {} as ReadModelMetadata\nreplace(Booster, 'config', config)\n\nconst spyMyCall = spy(MyCommand, 'myCall')\nconst command = new MyCommand('1', true)\nconst register = new Register('request-id-1')\nconst registerSpy = spy(register, 'events')\nawait MyCommand.handle(command, register)\n\nexpect(spyMyCall).to.have.been.calledOnceWithExactly('WoW')\nexpect(spyMyCall).to.have.been.calledOnceWithExactly('Amazing')\nexpect(registerSpy).to.have.been.calledOnceWithExactly(new MyEvent('1', 'WoW'))\nexpect(registerSpy).to.have.been.calledOnceWithExactly(new MyEvent('1', 'Amazing'))\n")),(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-typescript"},"// Example code\npublic static async handle(command: MyCommand, register: Register): Promise {\n const readModels = Booster.config.readModels\n for (const readModelName in readModels) {\n myCall(readModelName)\n register.events(new MyEvent(command.ID, readModelName))\n }\n}\n")),(0,a.kt)("h3",{id:"recommended-files"},"Recommended files"),(0,a.kt)("p",null,"These are some files that might help you speed up your testing with Booster."),(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-typescript"},"// /test/expect.ts\nimport * as chai from 'chai'\n\nchai.use(require('sinon-chai'))\nchai.use(require('chai-as-promised'))\n\nexport const expect = chai.expect\n")),(0,a.kt)("p",null,"This ",(0,a.kt)("inlineCode",{parentName:"p"},"expect")," method will help you with some more additional methods like ",(0,a.kt)("inlineCode",{parentName:"p"},"expect().to.have.been.calledOnceWithExactly()")),(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-yaml"},"# /.mocharc.yml\ndiff: true\nrequire: 'ts-node/register'\nextension:\n - ts\npackage: './package.json'\nrecursive: true\nreporter: 'spec'\ntimeout: 5000\nfull-trace: true\nbail: true\n")),(0,a.kt)("h2",{id:"framework-integration-tests"},"Framework integration tests"),(0,a.kt)("p",null,"Booster framework integration tests package is used to test the Booster project itself, but it is also an example of how a Booster application could be tested. We encourage developers to have a look at our ",(0,a.kt)("a",{parentName:"p",href:"https://github.com/boostercloud/booster/tree/main/packages/framework-integration-tests"},"Booster project repository"),"."),(0,a.kt)("p",null,"Some integration tests highly depend on the provider chosen for the project, and the infrastructure is normally deployed in the cloud right before the tests run. 
Once tests are completed, the application is teared down."),(0,a.kt)("p",null,"There are several types of integration tests in this package:"),(0,a.kt)("ul",null,(0,a.kt)("li",{parentName:"ul"},"Tests to ensure that different packages integrate as expected with each other."),(0,a.kt)("li",{parentName:"ul"},"Tests to ensure that a Booster application behaves as expected when it is hit by a client (a GraphQL client)."),(0,a.kt)("li",{parentName:"ul"},"Tests to ensure that the application behaves in the same way no matter what provider is selected.")))}d.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/09ff0a1d.58bfb1e8.js b/assets/js/09ff0a1d.58bfb1e8.js deleted file mode 100644 index 41a5d2982..000000000 --- a/assets/js/09ff0a1d.58bfb1e8.js +++ /dev/null @@ -1 +0,0 @@ -"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[5263],{3905:(e,t,a)=>{a.d(t,{Zo:()=>c,kt:()=>u});var n=a(7294);function i(e,t,a){return t in e?Object.defineProperty(e,t,{value:a,enumerable:!0,configurable:!0,writable:!0}):e[t]=a,e}function o(e,t){var a=Object.keys(e);if(Object.getOwnPropertySymbols){var n=Object.getOwnPropertySymbols(e);t&&(n=n.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),a.push.apply(a,n)}return a}function r(e){for(var t=1;t=0||(i[a]=e[a]);return i}(e,t);if(Object.getOwnPropertySymbols){var o=Object.getOwnPropertySymbols(e);for(n=0;n=0||Object.prototype.propertyIsEnumerable.call(e,a)&&(i[a]=e[a])}return i}var d=n.createContext({}),l=function(e){var t=n.useContext(d),a=t;return e&&(a="function"==typeof e?e(t):r(r({},t),e)),a},c=function(e){var t=l(e.components);return n.createElement(d.Provider,{value:t},e.children)},p={inlineCode:"code",wrapper:function(e){var t=e.children;return n.createElement(n.Fragment,{},t)}},m=n.forwardRef((function(e,t){var a=e.components,i=e.mdxType,o=e.originalType,d=e.parentName,c=s(e,["components","mdxType","originalType","parentName"]),m=l(a),u=i,h=m["".concat(d,".").concat(u)]||m[u]||p[u]||o;return a?n.createElement(h,r(r({ref:t},c),{},{components:a})):n.createElement(h,r({ref:t},c))}));function u(e,t){var a=arguments,i=t&&t.mdxType;if("string"==typeof e||i){var o=a.length,r=new Array(o);r[0]=m;var s={};for(var d in t)hasOwnProperty.call(t,d)&&(s[d]=t[d]);s.originalType=e,s.mdxType="string"==typeof e?e:i,r[1]=s;for(var l=2;l{a.r(t),a.d(t,{assets:()=>d,contentTitle:()=>r,default:()=>p,frontMatter:()=>o,metadata:()=>s,toc:()=>l});var n=a(7462),i=(a(7294),a(3905));const o={description:"Learn how to migrate data in Booster"},r="Migrations",s={unversionedId:"going-deeper/data-migrations",id:"going-deeper/data-migrations",title:"Migrations",description:"Learn how to migrate data in Booster",source:"@site/docs/10_going-deeper/data-migrations.md",sourceDirName:"10_going-deeper",slug:"/going-deeper/data-migrations",permalink:"/going-deeper/data-migrations",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/10_going-deeper/data-migrations.md",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1696768385,formattedLastUpdatedAt:"Oct 8, 2023",frontMatter:{description:"Learn how to migrate data in Booster"},sidebar:"docs",previous:{title:"Testing",permalink:"/going-deeper/testing"},next:{title:"TouchEntities",permalink:"/going-deeper/touch-entities"}},d={},l=[{value:"Schema migrations",id:"schema-migrations",level:2},{value:"Data migrations",id:"data-migrations",level:2}],c={toc:l};function 
p(e){let{components:t,...a}=e;return(0,i.kt)("wrapper",(0,n.Z)({},c,a,{components:t,mdxType:"MDXLayout"}),(0,i.kt)("h1",{id:"migrations"},"Migrations"),(0,i.kt)("p",null,"Migrations are a mechanism for updating or transforming the schemas of events and entities as your system evolves. This allows you to make changes to your data model without losing or corrupting existing data. There are two types of migration tools available in Booster: schema migrations and data migrations."),(0,i.kt)("ul",null,(0,i.kt)("li",{parentName:"ul"},(0,i.kt)("p",{parentName:"li"},(0,i.kt)("strong",{parentName:"p"},"Schema migrations")," are used to incrementally upgrade an event or entity from a past version to the next. They are applied lazily, meaning that they are performed on-the-fly whenever an event or entity is loaded. This allows you to make changes to your data model without having to manually update all existing artifacts, and makes it possible to apply changes without running lenghty migration processes.")),(0,i.kt)("li",{parentName:"ul"},(0,i.kt)("p",{parentName:"li"},(0,i.kt)("strong",{parentName:"p"},"Data migrations"),", on the other hand, behave as background processes that can actively change the existing values in the database for existing entities and read models. They are particularly useful for data migrations that cannot be performed automatically with schema migrations, or for updating existing read models after a schema change."))),(0,i.kt)("p",null,"Together, schema and data migrations provide a flexible and powerful toolset for managing the evolution of your data model over time."),(0,i.kt)("h2",{id:"schema-migrations"},"Schema migrations"),(0,i.kt)("p",null,"Booster handles classes annotated with ",(0,i.kt)("inlineCode",{parentName:"p"},"@Migrates")," as ",(0,i.kt)("strong",{parentName:"p"},"schema migrations"),". The migration functions defined inside will update an existing artifact (either an event or an entity) from a previous version to a newer one whenever that artifact is visited. Schema migrations are applied to events and entities lazyly, meaning that they are only applied when the event or entity is loaded. This ensures that the migration process is non-disruptive and does not affect the performance of your system. Schema migrations are also performed on-the-fly and the results are not written back to the database, as events are not revisited once the next snapshot is written in the database."),(0,i.kt)("p",null,"For example, to upgrade a ",(0,i.kt)("inlineCode",{parentName:"p"},"Product")," entity from version 1 to version 2, you can write the following migration class:"),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-typescript"},"@Migrates(Product)\nexport class ProductMigration {\n @ToVersion(2, { fromSchema: ProductV1, toSchema: ProductV2 })\n public async changeNameFieldToDisplayName(old: ProductV1): Promise {\n return new ProductV2(\n old.id,\n old.sku,\n old.name,\n old.description,\n old.price,\n old.pictures,\n old.deleted\n )\n }\n}\n")),(0,i.kt)("p",null,"Notice that we've used the ",(0,i.kt)("inlineCode",{parentName:"p"},"@ToVersion")," decorator in the above example. This decorator not only tells Booster what schema upgrade this migration performs, it also informs it about the existence of a version, which is always an integer number. Booster will always use the latest version known to tag newly created artifacts, defaulting to 1 when no migrations are defined. 
This ensures that the schema of newly created events and entities is up-to-date and that they can be migrated as needed in the future."),(0,i.kt)("p",null,"The ",(0,i.kt)("inlineCode",{parentName:"p"},"@ToVersion")," decorator takes two parameters in addition to the version: ",(0,i.kt)("inlineCode",{parentName:"p"},"fromSchema")," and ",(0,i.kt)("inlineCode",{parentName:"p"},"toSchema"),". The fromSchema parameter is set to ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductV1"),", while the ",(0,i.kt)("inlineCode",{parentName:"p"},"toSchema")," parameter is set to ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductV2"),". This tells Booster that the migration is updating the ",(0,i.kt)("inlineCode",{parentName:"p"},"Product")," object from version 1 (as defined by the ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductV1")," schema) to version 2 (as defined by the ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductV2")," schema)."),(0,i.kt)("p",null,"As Booster can easily read the structure of your classes, the schemas are described as plain classes that you can maintain as part of your code. The ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductV1")," class represents the schema of the previous version of the ",(0,i.kt)("inlineCode",{parentName:"p"},"Product")," object with the properties and structure of the ",(0,i.kt)("inlineCode",{parentName:"p"},"Product")," object as it was defined in version 1. The ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductV2")," class is an alias for the latest version of the Product object. You can use the ",(0,i.kt)("inlineCode",{parentName:"p"},"Product")," class here, there's no difference, but it's a good practice to create an alias for clarity."),(0,i.kt)("p",null,"It's a good practice to define the schema classes (",(0,i.kt)("inlineCode",{parentName:"p"},"ProductV1")," and ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductV2"),") as non-exported classes in the same migration file. This allows you to see the changes made between versions and helps to understand how the migration works:"),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-typescript"},"class ProductV1 {\n public constructor(\n public id: UUID,\n readonly sku: string,\n readonly name: string,\n readonly description: string,\n readonly price: Money,\n readonly pictures: Array,\n public deleted: boolean = false\n ) {}\n}\n\nclass ProductV2 extends Product {}\n")),(0,i.kt)("p",null,"When you want to upgrade your artifacts from V2 to V3, you can add a new function decorated with ",(0,i.kt)("inlineCode",{parentName:"p"},"@ToVersion")," to the same migrations class. You're free to structure the code the way you want, but we recommend keeping all migrations for the same artifact in the same migration class. 
For instance:"),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-typescript"},"@Migrates(Product)\nexport class ProductMigration {\n @ToVersion(2, { fromSchema: ProductV1, toSchema: ProductV2 })\n public async changeNameFieldToDisplayName(old: ProductV1): Promise {\n return new ProductV2(\n old.id,\n old.sku,\n old.name, // It's now called `displayName`\n old.description,\n old.price,\n old.pictures,\n old.deleted\n )\n }\n\n @ToVersion(3, { fromSchema: ProductV2, toSchema: ProductV3 })\n public async addNewField(old: ProductV2): Promise {\n return new ProductV3(\n old.id,\n old.sku,\n old.displayName,\n old.description,\n old.price,\n old.pictures,\n old.deleted,\n 42 // We set a default value to initialize this field\n )\n }\n}\n")),(0,i.kt)("p",null,"In this example, the ",(0,i.kt)("inlineCode",{parentName:"p"},"changeNameFieldToDisplayName")," function updates the ",(0,i.kt)("inlineCode",{parentName:"p"},"Product")," entity from version 1 to version 2 by renaming the ",(0,i.kt)("inlineCode",{parentName:"p"},"name")," field to ",(0,i.kt)("inlineCode",{parentName:"p"},"displayName"),". Then, ",(0,i.kt)("inlineCode",{parentName:"p"},"addNewField")," function updates the ",(0,i.kt)("inlineCode",{parentName:"p"},"Product")," entity from version 2 to version 3 by adding a new field called ",(0,i.kt)("inlineCode",{parentName:"p"},"newField")," to the entity's schema. Notice that at this point, your database could have snapshots set as v1, v2, or v3, so while it might be tempting to redefine the original migration to keep a single 1-to-3 migration, it's usually a good idea to keep the intermediate steps. This way Booster will be able to handle any scenario."),(0,i.kt)("h2",{id:"data-migrations"},"Data migrations"),(0,i.kt)("p",null,"Data migrations can be seen as background processes that can actively update the values of existing entities and read models in the database. They can be useful to perform data migrations that cannot be handled with schema migrations, for example when you need to update the values exposed by the GraphQL API, or to initialize new read models that are projections of previously existing entities."),(0,i.kt)("p",null,"To create a data migration in Booster, you can use the ",(0,i.kt)("inlineCode",{parentName:"p"},"@DataMigration")," decorator on a class that implements a ",(0,i.kt)("inlineCode",{parentName:"p"},"start")," method. The ",(0,i.kt)("inlineCode",{parentName:"p"},"@DataMigration")," decorator takes an object with a single parameter, ",(0,i.kt)("inlineCode",{parentName:"p"},"order"),", which specifies the order in which the data migration should be run relative to other data migrations."),(0,i.kt)("p",null,"Data migrations are not run automatically, you need to invoke the ",(0,i.kt)("inlineCode",{parentName:"p"},"BoosterDataMigrations.run()")," method from an event handler or a command. This will emit a ",(0,i.kt)("inlineCode",{parentName:"p"},"BoosterDataMigrationStarted")," event, which will make Booster check for any pending migrations and run them in the specified order. A common pattern to be able to run migrations on demand is to add a special command, with access limited to an administrator role which calls this function. "),(0,i.kt)("p",null,"Take into account that, depending on your cloud provider implementation, data migrations are executed in the context of a lambda or function app, so it's advisable to design these functions in a way that allow to re-run them in case of failures (i.e. lambda timeouts). 
In order to tell Booster that your migration has been applied successfully, at the end of each ",(0,i.kt)("inlineCode",{parentName:"p"},"DataMigration.start")," method, you must emit a ",(0,i.kt)("inlineCode",{parentName:"p"},"BoosterDataMigrationFinished")," event manually."),(0,i.kt)("p",null,"Inside your ",(0,i.kt)("inlineCode",{parentName:"p"},"@DataMigration")," classes, you can use the ",(0,i.kt)("inlineCode",{parentName:"p"},"BoosterDataMigrations.migrateEntity")," method to update the data for a specific entity. This method takes the old entity name, the old entity ID, and the new entity data as arguments. It will also generate an internal ",(0,i.kt)("inlineCode",{parentName:"p"},"BoosterEntityMigrated")," event before performing the migration."),(0,i.kt)("p",null,(0,i.kt)("strong",{parentName:"p"},"Note that Data migrations are only available in the Azure provider at the moment.")),(0,i.kt)("p",null,"Here is an example of how you might use the ",(0,i.kt)("inlineCode",{parentName:"p"},"@DataMigration")," decorator and the ",(0,i.kt)("inlineCode",{parentName:"p"},"Booster.migrateEntity")," method to update the quantity of the first item in a cart (",(0,i.kt)("strong",{parentName:"p"},"Notice that at the time of writing this document, the method ",(0,i.kt)("inlineCode",{parentName:"strong"},"Booster.entitiesIDs")," used in the following example is only available in the Azure provider, so you may need to approach the migration differently in AWS."),"):"),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-typescript"},"@DataMigration({\n order: 2,\n})\nexport class CartIdDataMigrateV2 {\n public constructor() {}\n\n\n public static async start(register: Register): Promise {\n const entitiesIdsResult = await Booster.entitiesIDs('Cart', 500, undefined)\n const paginatedEntityIdResults = entitiesIdsResult.items\n\n const carts = await Promise.all(\n paginatedEntityIdResults.map(async (entity) => await Booster.entity(Cart, entity.entityID))\n )\n return await Promise.all(\n carts.map(async (cart) => {\n cart.cartItems[0].quantity = 100\n const newCart = new Cart(cart.id, cart.cartItems, cart.shippingAddress, cart.checks)\n await BoosterDataMigrations.migrateEntity('Cart', validCart.id, newCart)\n return validCart.id\n })\n )\n\n register.events(new BoosterDataMigrationFinished('CartIdDataMigrateV2'))\n }\n}\n")),(0,i.kt)("h1",{id:"migrate-from-previous-booster-versions"},"Migrate from Previous Booster Versions"),(0,i.kt)("ul",null,(0,i.kt)("li",{parentName:"ul"},"To migrate to new versions of Booster, check that you have the latest development dependencies required:")),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-json"},'"devDependencies": {\n "rimraf": "^5.0.0",\n "@typescript-eslint/eslint-plugin": "4.22.1",\n "@typescript-eslint/parser": "4.22.1",\n "eslint": "7.26.0",\n "eslint-config-prettier": "8.3.0",\n "eslint-plugin-prettier": "3.4.0",\n "mocha": "10.2.0",\n "@types/mocha": "10.0.1",\n "nyc": "15.1.0",\n "prettier": "2.3.0",\n "typescript": "4.5.4",\n "ts-node": "9.1.1",\n "@types/node": "15.0.2",\n "ttypescript": "1.5.15",\n "@boostercloud/metadata-booster": "0.30.2"\n },\n')))}p.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/09ff0a1d.fa6ffb03.js b/assets/js/09ff0a1d.fa6ffb03.js new file mode 100644 index 000000000..87a96e4bc --- /dev/null +++ b/assets/js/09ff0a1d.fa6ffb03.js @@ -0,0 +1 @@ +"use 
strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[5263],{3905:(e,t,a)=>{a.d(t,{Zo:()=>c,kt:()=>u});var n=a(7294);function i(e,t,a){return t in e?Object.defineProperty(e,t,{value:a,enumerable:!0,configurable:!0,writable:!0}):e[t]=a,e}function o(e,t){var a=Object.keys(e);if(Object.getOwnPropertySymbols){var n=Object.getOwnPropertySymbols(e);t&&(n=n.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),a.push.apply(a,n)}return a}function r(e){for(var t=1;t=0||(i[a]=e[a]);return i}(e,t);if(Object.getOwnPropertySymbols){var o=Object.getOwnPropertySymbols(e);for(n=0;n=0||Object.prototype.propertyIsEnumerable.call(e,a)&&(i[a]=e[a])}return i}var d=n.createContext({}),l=function(e){var t=n.useContext(d),a=t;return e&&(a="function"==typeof e?e(t):r(r({},t),e)),a},c=function(e){var t=l(e.components);return n.createElement(d.Provider,{value:t},e.children)},p={inlineCode:"code",wrapper:function(e){var t=e.children;return n.createElement(n.Fragment,{},t)}},m=n.forwardRef((function(e,t){var a=e.components,i=e.mdxType,o=e.originalType,d=e.parentName,c=s(e,["components","mdxType","originalType","parentName"]),m=l(a),u=i,h=m["".concat(d,".").concat(u)]||m[u]||p[u]||o;return a?n.createElement(h,r(r({ref:t},c),{},{components:a})):n.createElement(h,r({ref:t},c))}));function u(e,t){var a=arguments,i=t&&t.mdxType;if("string"==typeof e||i){var o=a.length,r=new Array(o);r[0]=m;var s={};for(var d in t)hasOwnProperty.call(t,d)&&(s[d]=t[d]);s.originalType=e,s.mdxType="string"==typeof e?e:i,r[1]=s;for(var l=2;l{a.r(t),a.d(t,{assets:()=>d,contentTitle:()=>r,default:()=>p,frontMatter:()=>o,metadata:()=>s,toc:()=>l});var n=a(7462),i=(a(7294),a(3905));const o={description:"Learn how to migrate data in Booster"},r="Migrations",s={unversionedId:"going-deeper/data-migrations",id:"going-deeper/data-migrations",title:"Migrations",description:"Learn how to migrate data in Booster",source:"@site/docs/10_going-deeper/data-migrations.md",sourceDirName:"10_going-deeper",slug:"/going-deeper/data-migrations",permalink:"/going-deeper/data-migrations",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/10_going-deeper/data-migrations.md",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1697652123,formattedLastUpdatedAt:"Oct 18, 2023",frontMatter:{description:"Learn how to migrate data in Booster"},sidebar:"docs",previous:{title:"Testing",permalink:"/going-deeper/testing"},next:{title:"TouchEntities",permalink:"/going-deeper/touch-entities"}},d={},l=[{value:"Schema migrations",id:"schema-migrations",level:2},{value:"Data migrations",id:"data-migrations",level:2}],c={toc:l};function p(e){let{components:t,...a}=e;return(0,i.kt)("wrapper",(0,n.Z)({},c,a,{components:t,mdxType:"MDXLayout"}),(0,i.kt)("h1",{id:"migrations"},"Migrations"),(0,i.kt)("p",null,"Migrations are a mechanism for updating or transforming the schemas of events and entities as your system evolves. This allows you to make changes to your data model without losing or corrupting existing data. There are two types of migration tools available in Booster: schema migrations and data migrations."),(0,i.kt)("ul",null,(0,i.kt)("li",{parentName:"ul"},(0,i.kt)("p",{parentName:"li"},(0,i.kt)("strong",{parentName:"p"},"Schema migrations")," are used to incrementally upgrade an event or entity from a past version to the next. They are applied lazily, meaning that they are performed on-the-fly whenever an event or entity is loaded. 
This allows you to make changes to your data model without having to manually update all existing artifacts, and makes it possible to apply changes without running lenghty migration processes.")),(0,i.kt)("li",{parentName:"ul"},(0,i.kt)("p",{parentName:"li"},(0,i.kt)("strong",{parentName:"p"},"Data migrations"),", on the other hand, behave as background processes that can actively change the existing values in the database for existing entities and read models. They are particularly useful for data migrations that cannot be performed automatically with schema migrations, or for updating existing read models after a schema change."))),(0,i.kt)("p",null,"Together, schema and data migrations provide a flexible and powerful toolset for managing the evolution of your data model over time."),(0,i.kt)("h2",{id:"schema-migrations"},"Schema migrations"),(0,i.kt)("p",null,"Booster handles classes annotated with ",(0,i.kt)("inlineCode",{parentName:"p"},"@Migrates")," as ",(0,i.kt)("strong",{parentName:"p"},"schema migrations"),". The migration functions defined inside will update an existing artifact (either an event or an entity) from a previous version to a newer one whenever that artifact is visited. Schema migrations are applied to events and entities lazyly, meaning that they are only applied when the event or entity is loaded. This ensures that the migration process is non-disruptive and does not affect the performance of your system. Schema migrations are also performed on-the-fly and the results are not written back to the database, as events are not revisited once the next snapshot is written in the database."),(0,i.kt)("p",null,"For example, to upgrade a ",(0,i.kt)("inlineCode",{parentName:"p"},"Product")," entity from version 1 to version 2, you can write the following migration class:"),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-typescript"},"@Migrates(Product)\nexport class ProductMigration {\n @ToVersion(2, { fromSchema: ProductV1, toSchema: ProductV2 })\n public async changeNameFieldToDisplayName(old: ProductV1): Promise {\n return new ProductV2(\n old.id,\n old.sku,\n old.name,\n old.description,\n old.price,\n old.pictures,\n old.deleted\n )\n }\n}\n")),(0,i.kt)("p",null,"Notice that we've used the ",(0,i.kt)("inlineCode",{parentName:"p"},"@ToVersion")," decorator in the above example. This decorator not only tells Booster what schema upgrade this migration performs, it also informs it about the existence of a version, which is always an integer number. Booster will always use the latest version known to tag newly created artifacts, defaulting to 1 when no migrations are defined. This ensures that the schema of newly created events and entities is up-to-date and that they can be migrated as needed in the future."),(0,i.kt)("p",null,"The ",(0,i.kt)("inlineCode",{parentName:"p"},"@ToVersion")," decorator takes two parameters in addition to the version: ",(0,i.kt)("inlineCode",{parentName:"p"},"fromSchema")," and ",(0,i.kt)("inlineCode",{parentName:"p"},"toSchema"),". The fromSchema parameter is set to ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductV1"),", while the ",(0,i.kt)("inlineCode",{parentName:"p"},"toSchema")," parameter is set to ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductV2"),". 
This tells Booster that the migration is updating the ",(0,i.kt)("inlineCode",{parentName:"p"},"Product")," object from version 1 (as defined by the ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductV1")," schema) to version 2 (as defined by the ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductV2")," schema)."),(0,i.kt)("p",null,"As Booster can easily read the structure of your classes, the schemas are described as plain classes that you can maintain as part of your code. The ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductV1")," class represents the schema of the previous version of the ",(0,i.kt)("inlineCode",{parentName:"p"},"Product")," object with the properties and structure of the ",(0,i.kt)("inlineCode",{parentName:"p"},"Product")," object as it was defined in version 1. The ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductV2")," class is an alias for the latest version of the Product object. You can use the ",(0,i.kt)("inlineCode",{parentName:"p"},"Product")," class here, there's no difference, but it's a good practice to create an alias for clarity."),(0,i.kt)("p",null,"It's a good practice to define the schema classes (",(0,i.kt)("inlineCode",{parentName:"p"},"ProductV1")," and ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductV2"),") as non-exported classes in the same migration file. This allows you to see the changes made between versions and helps to understand how the migration works:"),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-typescript"},"class ProductV1 {\n public constructor(\n public id: UUID,\n readonly sku: string,\n readonly name: string,\n readonly description: string,\n readonly price: Money,\n readonly pictures: Array,\n public deleted: boolean = false\n ) {}\n}\n\nclass ProductV2 extends Product {}\n")),(0,i.kt)("p",null,"When you want to upgrade your artifacts from V2 to V3, you can add a new function decorated with ",(0,i.kt)("inlineCode",{parentName:"p"},"@ToVersion")," to the same migrations class. You're free to structure the code the way you want, but we recommend keeping all migrations for the same artifact in the same migration class. For instance:"),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-typescript"},"@Migrates(Product)\nexport class ProductMigration {\n @ToVersion(2, { fromSchema: ProductV1, toSchema: ProductV2 })\n public async changeNameFieldToDisplayName(old: ProductV1): Promise {\n return new ProductV2(\n old.id,\n old.sku,\n old.name, // It's now called `displayName`\n old.description,\n old.price,\n old.pictures,\n old.deleted\n )\n }\n\n @ToVersion(3, { fromSchema: ProductV2, toSchema: ProductV3 })\n public async addNewField(old: ProductV2): Promise {\n return new ProductV3(\n old.id,\n old.sku,\n old.displayName,\n old.description,\n old.price,\n old.pictures,\n old.deleted,\n 42 // We set a default value to initialize this field\n )\n }\n}\n")),(0,i.kt)("p",null,"In this example, the ",(0,i.kt)("inlineCode",{parentName:"p"},"changeNameFieldToDisplayName")," function updates the ",(0,i.kt)("inlineCode",{parentName:"p"},"Product")," entity from version 1 to version 2 by renaming the ",(0,i.kt)("inlineCode",{parentName:"p"},"name")," field to ",(0,i.kt)("inlineCode",{parentName:"p"},"displayName"),". Then, ",(0,i.kt)("inlineCode",{parentName:"p"},"addNewField")," function updates the ",(0,i.kt)("inlineCode",{parentName:"p"},"Product")," entity from version 2 to version 3 by adding a new field called ",(0,i.kt)("inlineCode",{parentName:"p"},"newField")," to the entity's schema. 
Notice that at this point, your database could have snapshots set as v1, v2, or v3, so while it might be tempting to redefine the original migration to keep a single 1-to-3 migration, it's usually a good idea to keep the intermediate steps. This way Booster will be able to handle any scenario."),(0,i.kt)("h2",{id:"data-migrations"},"Data migrations"),(0,i.kt)("p",null,"Data migrations can be seen as background processes that can actively update the values of existing entities and read models in the database. They can be useful to perform data migrations that cannot be handled with schema migrations, for example when you need to update the values exposed by the GraphQL API, or to initialize new read models that are projections of previously existing entities."),(0,i.kt)("p",null,"To create a data migration in Booster, you can use the ",(0,i.kt)("inlineCode",{parentName:"p"},"@DataMigration")," decorator on a class that implements a ",(0,i.kt)("inlineCode",{parentName:"p"},"start")," method. The ",(0,i.kt)("inlineCode",{parentName:"p"},"@DataMigration")," decorator takes an object with a single parameter, ",(0,i.kt)("inlineCode",{parentName:"p"},"order"),", which specifies the order in which the data migration should be run relative to other data migrations."),(0,i.kt)("p",null,"Data migrations are not run automatically, you need to invoke the ",(0,i.kt)("inlineCode",{parentName:"p"},"BoosterDataMigrations.run()")," method from an event handler or a command. This will emit a ",(0,i.kt)("inlineCode",{parentName:"p"},"BoosterDataMigrationStarted")," event, which will make Booster check for any pending migrations and run them in the specified order. A common pattern to be able to run migrations on demand is to add a special command, with access limited to an administrator role which calls this function. "),(0,i.kt)("p",null,"Take into account that, depending on your cloud provider implementation, data migrations are executed in the context of a lambda or function app, so it's advisable to design these functions in a way that allow to re-run them in case of failures (i.e. lambda timeouts). In order to tell Booster that your migration has been applied successfully, at the end of each ",(0,i.kt)("inlineCode",{parentName:"p"},"DataMigration.start")," method, you must emit a ",(0,i.kt)("inlineCode",{parentName:"p"},"BoosterDataMigrationFinished")," event manually."),(0,i.kt)("p",null,"Inside your ",(0,i.kt)("inlineCode",{parentName:"p"},"@DataMigration")," classes, you can use the ",(0,i.kt)("inlineCode",{parentName:"p"},"BoosterDataMigrations.migrateEntity")," method to update the data for a specific entity. This method takes the old entity name, the old entity ID, and the new entity data as arguments. 
It will also generate an internal ",(0,i.kt)("inlineCode",{parentName:"p"},"BoosterEntityMigrated")," event before performing the migration."),(0,i.kt)("p",null,(0,i.kt)("strong",{parentName:"p"},"Note that data migrations are only available in the Azure provider at the moment.")),(0,i.kt)("p",null,"Here is an example of how you might use the ",(0,i.kt)("inlineCode",{parentName:"p"},"@DataMigration")," decorator and the ",(0,i.kt)("inlineCode",{parentName:"p"},"BoosterDataMigrations.migrateEntity")," method to update the quantity of the first item in a cart (",(0,i.kt)("strong",{parentName:"p"},"Notice that at the time of writing this document, the method ",(0,i.kt)("inlineCode",{parentName:"strong"},"Booster.entitiesIDs")," used in the following example is only available in the Azure provider, so you may need to approach the migration differently in AWS."),"):"),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-typescript"},"@DataMigration({\n  order: 2,\n})\nexport class CartIdDataMigrateV2 {\n  public constructor() {}\n\n  public static async start(register: Register): Promise {\n    const entitiesIdsResult = await Booster.entitiesIDs('Cart', 500, undefined)\n    const paginatedEntityIdResults = entitiesIdsResult.items\n\n    const carts = await Promise.all(\n      paginatedEntityIdResults.map(async (entity) => await Booster.entity(Cart, entity.entityID))\n    )\n    await Promise.all(\n      carts.map(async (cart) => {\n        cart.cartItems[0].quantity = 100\n        const newCart = new Cart(cart.id, cart.cartItems, cart.shippingAddress, cart.checks)\n        await BoosterDataMigrations.migrateEntity('Cart', cart.id, newCart)\n        return cart.id\n      })\n    )\n\n    register.events(new BoosterDataMigrationFinished('CartIdDataMigrateV2'))\n  }\n}\n")),(0,i.kt)("h1",{id:"migrate-from-previous-booster-versions"},"Migrate from Previous Booster Versions"),(0,i.kt)("ul",null,(0,i.kt)("li",{parentName:"ul"},"To migrate to new versions of Booster, check that you have the latest development dependencies required:")),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-json"},'"devDependencies": {\n    "rimraf": "^5.0.0",\n    "@typescript-eslint/eslint-plugin": "4.22.1",\n    "@typescript-eslint/parser": "4.22.1",\n    "eslint": "7.26.0",\n    "eslint-config-prettier": "8.3.0",\n    "eslint-plugin-prettier": "3.4.0",\n    "mocha": "10.2.0",\n    "@types/mocha": "10.0.1",\n    "nyc": "15.1.0",\n    "prettier": "2.3.0",\n    "typescript": "4.5.4",\n    "ts-node": "9.1.1",\n    "@types/node": "15.0.2",\n    "ttypescript": "1.5.15",\n    "@boostercloud/metadata-booster": "0.30.2"\n  },\n')))}p.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/10057e71.c100e740.js b/assets/js/10057e71.c100e740.js new file mode 100644 index 000000000..244bcc274 --- /dev/null +++ b/assets/js/10057e71.c100e740.js @@ -0,0 +1 @@ +"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[4274],{3905:(e,t,n)=>{n.d(t,{Zo:()=>c,kt:()=>u});var a=n(7294);function r(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function o(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);t&&(a=a.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,a)}return n}function l(e){for(var t=1;t=0||(r[n]=e[n]);return r}(e,t);if(Object.getOwnPropertySymbols){var o=Object.getOwnPropertySymbols(e);for(a=0;a=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(r[n]=e[n])}return r}var 
d=a.createContext({}),p=function(e){var t=a.useContext(d),n=t;return e&&(n="function"==typeof e?e(t):l(l({},t),e)),n},c=function(e){var t=p(e.components);return a.createElement(d.Provider,{value:t},e.children)},m={inlineCode:"code",wrapper:function(e){var t=e.children;return a.createElement(a.Fragment,{},t)}},s=a.forwardRef((function(e,t){var n=e.components,r=e.mdxType,o=e.originalType,d=e.parentName,c=i(e,["components","mdxType","originalType","parentName"]),s=p(n),u=r,h=s["".concat(d,".").concat(u)]||s[u]||m[u]||o;return n?a.createElement(h,l(l({ref:t},c),{},{components:n})):a.createElement(h,l({ref:t},c))}));function u(e,t){var n=arguments,r=t&&t.mdxType;if("string"==typeof e||r){var o=n.length,l=new Array(o);l[0]=s;var i={};for(var d in t)hasOwnProperty.call(t,d)&&(i[d]=t[d]);i.originalType=e,i.mdxType="string"==typeof e?e:r,l[1]=i;for(var p=2;p{n.d(t,{Z:()=>p});var a=n(7294);const r="terminalWindow_wGrl",o="terminalWindowHeader_o9Cs",l="buttons_IGLB",i="dot_fGZE",d="terminalWindowBody_tzdS";function p(e){let{children:t}=e;return a.createElement("div",{className:r},a.createElement("div",{className:o},a.createElement("div",{className:l},a.createElement("span",{className:i,style:{background:"#f25f58"}}),a.createElement("span",{className:i,style:{background:"#fbbe3c"}}),a.createElement("span",{className:i,style:{background:"#58cb42"}}))),a.createElement("div",{className:d},t))}},5024:(e,t,n)=>{n.r(t),n.d(t,{assets:()=>p,contentTitle:()=>i,default:()=>s,frontMatter:()=>l,metadata:()=>d,toc:()=>c});var a=n(7462),r=(n(7294),n(3905)),o=n(5163);const l={},i="Booster CLI",d={unversionedId:"booster-cli",id:"booster-cli",title:"Booster CLI",description:"Booster CLI is a command line interface that helps you to create, develop, and deploy your Booster applications. It is built with Node.js and published to NPM through the package @boostercloud/cli . You can install it using any compatible package manager. If you want to contribute to the project, you will also need to clone the GitHub repository and compile the source code.",source:"@site/docs/05_booster-cli.mdx",sourceDirName:".",slug:"/booster-cli",permalink:"/booster-cli",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/05_booster-cli.mdx",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1697652123,formattedLastUpdatedAt:"Oct 18, 2023",sidebarPosition:5,frontMatter:{},sidebar:"docs",previous:{title:"GraphQL API",permalink:"/graphql"},next:{title:"Going deeper with Booster",permalink:"/category/going-deeper-with-booster"}},p={},c=[{value:"Installation",id:"installation",level:2},{value:"Usage",id:"usage",level:2},{value:"Command Overview",id:"command-overview",level:2}],m={toc:c};function s(e){let{components:t,...n}=e;return(0,r.kt)("wrapper",(0,a.Z)({},m,n,{components:t,mdxType:"MDXLayout"}),(0,r.kt)("h1",{id:"booster-cli"},"Booster CLI"),(0,r.kt)("p",null,"Booster CLI is a command line interface that helps you to create, develop, and deploy your Booster applications. It is built with Node.js and published to NPM through the package ",(0,r.kt)("inlineCode",{parentName:"p"},"@boostercloud/cli")," . You can install it using any compatible package manager. If you want to contribute to the project, you will also need to clone the GitHub repository and compile the source code."),(0,r.kt)("h2",{id:"installation"},"Installation"),(0,r.kt)("p",null,"The preferred way to install the Booster CLI is through NPM. 
You can install it following the instructions in the ",(0,r.kt)("a",{parentName:"p",href:"https://nodejs.org/en/download/"},"Node.js website"),"."),(0,r.kt)("p",null,"Once you have NPM installed, you can install the Booster CLI by running this command:"),(0,r.kt)(o.Z,{mdxType:"TerminalWindow"},(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-bash"},"npm install -g @boostercloud/cli\n"))),(0,r.kt)("h2",{id:"usage"},"Usage"),(0,r.kt)("p",null,"Once the installation is finished, you will have the ",(0,r.kt)("inlineCode",{parentName:"p"},"boost")," command available in your terminal. You can run it to see the help message."),(0,r.kt)("admonition",{type:"tip"},(0,r.kt)("p",{parentName:"admonition"},"You can also run ",(0,r.kt)("inlineCode",{parentName:"p"},"boost --help")," to get the same output.")),(0,r.kt)("h2",{id:"command-overview"},"Command Overview"),(0,r.kt)("table",null,(0,r.kt)("thead",{parentName:"table"},(0,r.kt)("tr",{parentName:"thead"},(0,r.kt)("th",{parentName:"tr",align:null},"Command"),(0,r.kt)("th",{parentName:"tr",align:null},"Description"))),(0,r.kt)("tbody",{parentName:"table"},(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"#new"},(0,r.kt)("inlineCode",{parentName:"a"},"new:project"))),(0,r.kt)("td",{parentName:"tr",align:null},"Creates a new Booster project in a new directory")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/architecture/command#creating-a-command"},(0,r.kt)("inlineCode",{parentName:"a"},"new:command"))),(0,r.kt)("td",{parentName:"tr",align:null},"Creates a new command in the project")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/architecture/entity#creating-an-entity"},(0,r.kt)("inlineCode",{parentName:"a"},"new:entity"))),(0,r.kt)("td",{parentName:"tr",align:null},"Creates a new entity in the project")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/architecture/event#creating-an-event"},(0,r.kt)("inlineCode",{parentName:"a"},"new:event"))),(0,r.kt)("td",{parentName:"tr",align:null},"Creates a new event in the project")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/architecture/event-handler#creating-an-event-handler"},(0,r.kt)("inlineCode",{parentName:"a"},"new:event-handler"))),(0,r.kt)("td",{parentName:"tr",align:null},"Creates a new event handler in the project")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/architecture/read-model#creating-a-read-model"},(0,r.kt)("inlineCode",{parentName:"a"},"new:read-model"))),(0,r.kt)("td",{parentName:"tr",align:null},"Creates a new read model in the project")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/features/schedule-actions#creating-a-scheduled-command"},(0,r.kt)("inlineCode",{parentName:"a"},"new:scheduled-command"))),(0,r.kt)("td",{parentName:"tr",align:null},"Creates a new scheduled command in the 
project")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null}),(0,r.kt)("td",{parentName:"tr",align:null})),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/getting-started/coding#6-deployment"},(0,r.kt)("inlineCode",{parentName:"a"},"start -e "))),(0,r.kt)("td",{parentName:"tr",align:null},"Starts the project in development mode")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/getting-started/coding#6-deployment"},(0,r.kt)("inlineCode",{parentName:"a"},"build"))),(0,r.kt)("td",{parentName:"tr",align:null},"Builds the project")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/getting-started/coding#6-deployment"},(0,r.kt)("inlineCode",{parentName:"a"},"deploy -e "))),(0,r.kt)("td",{parentName:"tr",align:null},"Deploys the project to the cloud")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("inlineCode",{parentName:"td"},"nuke")),(0,r.kt)("td",{parentName:"tr",align:null},"Deletes all the resources created by the deploy command")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null}),(0,r.kt)("td",{parentName:"tr",align:null})))))}s.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/10057e71.f1d55577.js b/assets/js/10057e71.f1d55577.js deleted file mode 100644 index dc93014cd..000000000 --- a/assets/js/10057e71.f1d55577.js +++ /dev/null @@ -1 +0,0 @@ -"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[4274],{3905:(e,t,n)=>{n.d(t,{Zo:()=>c,kt:()=>u});var a=n(7294);function r(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function o(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);t&&(a=a.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,a)}return n}function l(e){for(var t=1;t=0||(r[n]=e[n]);return r}(e,t);if(Object.getOwnPropertySymbols){var o=Object.getOwnPropertySymbols(e);for(a=0;a=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(r[n]=e[n])}return r}var d=a.createContext({}),p=function(e){var t=a.useContext(d),n=t;return e&&(n="function"==typeof e?e(t):l(l({},t),e)),n},c=function(e){var t=p(e.components);return a.createElement(d.Provider,{value:t},e.children)},m={inlineCode:"code",wrapper:function(e){var t=e.children;return a.createElement(a.Fragment,{},t)}},s=a.forwardRef((function(e,t){var n=e.components,r=e.mdxType,o=e.originalType,d=e.parentName,c=i(e,["components","mdxType","originalType","parentName"]),s=p(n),u=r,h=s["".concat(d,".").concat(u)]||s[u]||m[u]||o;return n?a.createElement(h,l(l({ref:t},c),{},{components:n})):a.createElement(h,l({ref:t},c))}));function u(e,t){var n=arguments,r=t&&t.mdxType;if("string"==typeof e||r){var o=n.length,l=new Array(o);l[0]=s;var i={};for(var d in t)hasOwnProperty.call(t,d)&&(i[d]=t[d]);i.originalType=e,i.mdxType="string"==typeof e?e:r,l[1]=i;for(var p=2;p{n.d(t,{Z:()=>p});var a=n(7294);const r="terminalWindow_wGrl",o="terminalWindowHeader_o9Cs",l="buttons_IGLB",i="dot_fGZE",d="terminalWindowBody_tzdS";function p(e){let{children:t}=e;return 
a.createElement("div",{className:r},a.createElement("div",{className:o},a.createElement("div",{className:l},a.createElement("span",{className:i,style:{background:"#f25f58"}}),a.createElement("span",{className:i,style:{background:"#fbbe3c"}}),a.createElement("span",{className:i,style:{background:"#58cb42"}}))),a.createElement("div",{className:d},t))}},5024:(e,t,n)=>{n.r(t),n.d(t,{assets:()=>p,contentTitle:()=>i,default:()=>s,frontMatter:()=>l,metadata:()=>d,toc:()=>c});var a=n(7462),r=(n(7294),n(3905)),o=n(5163);const l={},i="Booster CLI",d={unversionedId:"booster-cli",id:"booster-cli",title:"Booster CLI",description:"Booster CLI is a command line interface that helps you to create, develop, and deploy your Booster applications. It is built with Node.js and published to NPM through the package @boostercloud/cli . You can install it using any compatible package manager. If you want to contribute to the project, you will also need to clone the GitHub repository and compile the source code.",source:"@site/docs/05_booster-cli.mdx",sourceDirName:".",slug:"/booster-cli",permalink:"/booster-cli",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/05_booster-cli.mdx",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1696768385,formattedLastUpdatedAt:"Oct 8, 2023",sidebarPosition:5,frontMatter:{},sidebar:"docs",previous:{title:"GraphQL API",permalink:"/graphql"},next:{title:"Going deeper with Booster",permalink:"/category/going-deeper-with-booster"}},p={},c=[{value:"Installation",id:"installation",level:2},{value:"Usage",id:"usage",level:2},{value:"Command Overview",id:"command-overview",level:2}],m={toc:c};function s(e){let{components:t,...n}=e;return(0,r.kt)("wrapper",(0,a.Z)({},m,n,{components:t,mdxType:"MDXLayout"}),(0,r.kt)("h1",{id:"booster-cli"},"Booster CLI"),(0,r.kt)("p",null,"Booster CLI is a command line interface that helps you to create, develop, and deploy your Booster applications. It is built with Node.js and published to NPM through the package ",(0,r.kt)("inlineCode",{parentName:"p"},"@boostercloud/cli")," . You can install it using any compatible package manager. If you want to contribute to the project, you will also need to clone the GitHub repository and compile the source code."),(0,r.kt)("h2",{id:"installation"},"Installation"),(0,r.kt)("p",null,"The preferred way to install the Booster CLI is through NPM. You can install it following the instructions in the ",(0,r.kt)("a",{parentName:"p",href:"https://nodejs.org/en/download/"},"Node.js website"),"."),(0,r.kt)("p",null,"Once you have NPM installed, you can install the Booster CLI by running this command:"),(0,r.kt)(o.Z,{mdxType:"TerminalWindow"},(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-bash"},"npm install -g @boostercloud/cli\n"))),(0,r.kt)("h2",{id:"usage"},"Usage"),(0,r.kt)("p",null,"Once the installation is finished, you will have the ",(0,r.kt)("inlineCode",{parentName:"p"},"boost")," command available in your terminal. 
You can run it to see the help message."),(0,r.kt)("admonition",{type:"tip"},(0,r.kt)("p",{parentName:"admonition"},"You can also run ",(0,r.kt)("inlineCode",{parentName:"p"},"boost --help")," to get the same output.")),(0,r.kt)("h2",{id:"command-overview"},"Command Overview"),(0,r.kt)("table",null,(0,r.kt)("thead",{parentName:"table"},(0,r.kt)("tr",{parentName:"thead"},(0,r.kt)("th",{parentName:"tr",align:null},"Command"),(0,r.kt)("th",{parentName:"tr",align:null},"Description"))),(0,r.kt)("tbody",{parentName:"table"},(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"#new"},(0,r.kt)("inlineCode",{parentName:"a"},"new:project"))),(0,r.kt)("td",{parentName:"tr",align:null},"Creates a new Booster project in a new directory")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/architecture/command#creating-a-command"},(0,r.kt)("inlineCode",{parentName:"a"},"new:command"))),(0,r.kt)("td",{parentName:"tr",align:null},"Creates a new command in the project")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/architecture/entity#creating-an-entity"},(0,r.kt)("inlineCode",{parentName:"a"},"new:entity"))),(0,r.kt)("td",{parentName:"tr",align:null},"Creates a new entity in the project")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/architecture/event#creating-an-event"},(0,r.kt)("inlineCode",{parentName:"a"},"new:event"))),(0,r.kt)("td",{parentName:"tr",align:null},"Creates a new event in the project")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/architecture/event-handler#creating-an-event-handler"},(0,r.kt)("inlineCode",{parentName:"a"},"new:event-handler"))),(0,r.kt)("td",{parentName:"tr",align:null},"Creates a new event handler in the project")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/architecture/read-model#creating-a-read-model"},(0,r.kt)("inlineCode",{parentName:"a"},"new:read-model"))),(0,r.kt)("td",{parentName:"tr",align:null},"Creates a new read model in the project")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/features/schedule-actions#creating-a-scheduled-command"},(0,r.kt)("inlineCode",{parentName:"a"},"new:scheduled-command"))),(0,r.kt)("td",{parentName:"tr",align:null},"Creates a new scheduled command in the project")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null}),(0,r.kt)("td",{parentName:"tr",align:null})),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/getting-started/coding#6-deployment"},(0,r.kt)("inlineCode",{parentName:"a"},"start -e "))),(0,r.kt)("td",{parentName:"tr",align:null},"Starts the project in development mode")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/getting-started/coding#6-deployment"},(0,r.kt)("inlineCode",{parentName:"a"},"build"))),(0,r.kt)("td",{parentName:"tr",align:null},"Builds the project")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("a",{parentName:"td",href:"/getting-started/coding#6-deployment"},(0,r.kt)("inlineCode",{parentName:"a"},"deploy -e 
"))),(0,r.kt)("td",{parentName:"tr",align:null},"Deploys the project to the cloud")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null},(0,r.kt)("inlineCode",{parentName:"td"},"nuke")),(0,r.kt)("td",{parentName:"tr",align:null},"Deletes all the resources created by the deploy command")),(0,r.kt)("tr",{parentName:"tbody"},(0,r.kt)("td",{parentName:"tr",align:null}),(0,r.kt)("td",{parentName:"tr",align:null})))))}s.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/192d5973.222c2f97.js b/assets/js/192d5973.66207475.js similarity index 58% rename from assets/js/192d5973.222c2f97.js rename to assets/js/192d5973.66207475.js index 08aece2be..ea43454d6 100644 --- a/assets/js/192d5973.222c2f97.js +++ b/assets/js/192d5973.66207475.js @@ -1 +1 @@ -"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[1300],{3905:(e,t,r)=>{r.d(t,{Zo:()=>l,kt:()=>f});var o=r(7294);function n(e,t,r){return t in e?Object.defineProperty(e,t,{value:r,enumerable:!0,configurable:!0,writable:!0}):e[t]=r,e}function a(e,t){var r=Object.keys(e);if(Object.getOwnPropertySymbols){var o=Object.getOwnPropertySymbols(e);t&&(o=o.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),r.push.apply(r,o)}return r}function s(e){for(var t=1;t=0||(n[r]=e[r]);return n}(e,t);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);for(o=0;o=0||Object.prototype.propertyIsEnumerable.call(e,r)&&(n[r]=e[r])}return n}var c=o.createContext({}),p=function(e){var t=o.useContext(c),r=t;return e&&(r="function"==typeof e?e(t):s(s({},t),e)),r},l=function(e){var t=p(e.components);return o.createElement(c.Provider,{value:t},e.children)},u={inlineCode:"code",wrapper:function(e){var t=e.children;return o.createElement(o.Fragment,{},t)}},d=o.forwardRef((function(e,t){var r=e.components,n=e.mdxType,a=e.originalType,c=e.parentName,l=i(e,["components","mdxType","originalType","parentName"]),d=p(r),f=n,m=d["".concat(c,".").concat(f)]||d[f]||u[f]||a;return r?o.createElement(m,s(s({ref:t},l),{},{components:r})):o.createElement(m,s({ref:t},l))}));function f(e,t){var r=arguments,n=t&&t.mdxType;if("string"==typeof e||n){var a=r.length,s=new Array(a);s[0]=d;var i={};for(var c in t)hasOwnProperty.call(t,c)&&(i[c]=t[c]);i.originalType=e,i.mdxType="string"==typeof e?e:n,s[1]=i;for(var p=2;p{r.r(t),r.d(t,{assets:()=>c,contentTitle:()=>s,default:()=>u,frontMatter:()=>a,metadata:()=>i,toc:()=>p});var o=r(7462),n=(r(7294),r(3905));const a={},s="Static Sites Rocket",i={unversionedId:"going-deeper/rockets/rocket-static-sites",id:"going-deeper/rockets/rocket-static-sites",title:"Static Sites Rocket",description:"This package is a configurable Booster rocket to add static site deployment to your Booster applications. 
It uploads your root.",source:"@site/docs/10_going-deeper/rockets/rocket-static-sites.md",sourceDirName:"10_going-deeper/rockets",slug:"/going-deeper/rockets/rocket-static-sites",permalink:"/going-deeper/rockets/rocket-static-sites",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/10_going-deeper/rockets/rocket-static-sites.md",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1696768385,formattedLastUpdatedAt:"Oct 8, 2023",frontMatter:{},sidebar:"docs",previous:{title:"Backup Booster Rocket",permalink:"/going-deeper/rockets/rocket-backup-booster"},next:{title:"Webhook Rocket",permalink:"/going-deeper/rockets/rocket-webhook"}},c={},p=[{value:"Usage",id:"usage",level:2}],l={toc:p};function u(e){let{components:t,...r}=e;return(0,n.kt)("wrapper",(0,o.Z)({},l,r,{components:t,mdxType:"MDXLayout"}),(0,n.kt)("h1",{id:"static-sites-rocket"},"Static Sites Rocket"),(0,n.kt)("p",null,"This package is a configurable Booster rocket to add static site deployment to your Booster applications. It uploads your root."),(0,n.kt)("admonition",{type:"info"},(0,n.kt)("p",{parentName:"admonition"},(0,n.kt)("a",{parentName:"p",href:"https://github.com/boostercloud/rocket-static-sites-aws-infrastructure"},"GitHub Repo"))),(0,n.kt)("h2",{id:"usage"},"Usage"),(0,n.kt)("p",null," Install this package as a dev dependency in your Booster project (It's a dev dependency because it's only used during deployment, but we don't want this code to be uploaded to the project lambdas)"),(0,n.kt)("pre",null,(0,n.kt)("code",{parentName:"pre",className:"language-bash"},"npm install --save-dev @boostercloud/rocket-static-sites-aws-infrastructure\n")),(0,n.kt)("p",null," In your Booster config file, pass a RocketDescriptor in the config.rockets array to configuring the static site rocket:"),(0,n.kt)("pre",null,(0,n.kt)("code",{parentName:"pre",className:"language-typescript"},"import { Booster } from '@boostercloud/framework-core'\nimport { BoosterConfig } from '@boostercloud/framework-types'\n\nBooster.configure('development', (config: BoosterConfig): void => {\n config.appName = 'my-store'\n config.rockets = [\n {\n packageName: '@boostercloud/rocket-static-sites-aws-infrastructure', \n parameters: {\n bucketName: 'test-bucket-name', // Required\n rootPath: './frontend/dist', // Defaults to ./public\n indexFile: 'main.html', // File to render when users access the CLoudFormation URL. Defaults to index.html\n errorFile: 'error.html', // File to render when there's an error. 
Defaults to 404.html\n }\n },\n ]\n})\n")))}u.isMDXComponent=!0}}]); \ No newline at end of file +"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[1300],{3905:(e,t,r)=>{r.d(t,{Zo:()=>l,kt:()=>f});var o=r(7294);function n(e,t,r){return t in e?Object.defineProperty(e,t,{value:r,enumerable:!0,configurable:!0,writable:!0}):e[t]=r,e}function a(e,t){var r=Object.keys(e);if(Object.getOwnPropertySymbols){var o=Object.getOwnPropertySymbols(e);t&&(o=o.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),r.push.apply(r,o)}return r}function s(e){for(var t=1;t=0||(n[r]=e[r]);return n}(e,t);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);for(o=0;o=0||Object.prototype.propertyIsEnumerable.call(e,r)&&(n[r]=e[r])}return n}var c=o.createContext({}),p=function(e){var t=o.useContext(c),r=t;return e&&(r="function"==typeof e?e(t):s(s({},t),e)),r},l=function(e){var t=p(e.components);return o.createElement(c.Provider,{value:t},e.children)},u={inlineCode:"code",wrapper:function(e){var t=e.children;return o.createElement(o.Fragment,{},t)}},d=o.forwardRef((function(e,t){var r=e.components,n=e.mdxType,a=e.originalType,c=e.parentName,l=i(e,["components","mdxType","originalType","parentName"]),d=p(r),f=n,m=d["".concat(c,".").concat(f)]||d[f]||u[f]||a;return r?o.createElement(m,s(s({ref:t},l),{},{components:r})):o.createElement(m,s({ref:t},l))}));function f(e,t){var r=arguments,n=t&&t.mdxType;if("string"==typeof e||n){var a=r.length,s=new Array(a);s[0]=d;var i={};for(var c in t)hasOwnProperty.call(t,c)&&(i[c]=t[c]);i.originalType=e,i.mdxType="string"==typeof e?e:n,s[1]=i;for(var p=2;p{r.r(t),r.d(t,{assets:()=>c,contentTitle:()=>s,default:()=>u,frontMatter:()=>a,metadata:()=>i,toc:()=>p});var o=r(7462),n=(r(7294),r(3905));const a={},s="Static Sites Rocket",i={unversionedId:"going-deeper/rockets/rocket-static-sites",id:"going-deeper/rockets/rocket-static-sites",title:"Static Sites Rocket",description:"This package is a configurable Booster rocket to add static site deployment to your Booster applications. It uploads your root.",source:"@site/docs/10_going-deeper/rockets/rocket-static-sites.md",sourceDirName:"10_going-deeper/rockets",slug:"/going-deeper/rockets/rocket-static-sites",permalink:"/going-deeper/rockets/rocket-static-sites",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/10_going-deeper/rockets/rocket-static-sites.md",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1697652123,formattedLastUpdatedAt:"Oct 18, 2023",frontMatter:{},sidebar:"docs",previous:{title:"Backup Booster Rocket",permalink:"/going-deeper/rockets/rocket-backup-booster"},next:{title:"Webhook Rocket",permalink:"/going-deeper/rockets/rocket-webhook"}},c={},p=[{value:"Usage",id:"usage",level:2}],l={toc:p};function u(e){let{components:t,...r}=e;return(0,n.kt)("wrapper",(0,o.Z)({},l,r,{components:t,mdxType:"MDXLayout"}),(0,n.kt)("h1",{id:"static-sites-rocket"},"Static Sites Rocket"),(0,n.kt)("p",null,"This package is a configurable Booster rocket to add static site deployment to your Booster applications. 
It uploads your root."),(0,n.kt)("admonition",{type:"info"},(0,n.kt)("p",{parentName:"admonition"},(0,n.kt)("a",{parentName:"p",href:"https://github.com/boostercloud/rocket-static-sites-aws-infrastructure"},"GitHub Repo"))),(0,n.kt)("h2",{id:"usage"},"Usage"),(0,n.kt)("p",null," Install this package as a dev dependency in your Booster project (It's a dev dependency because it's only used during deployment, but we don't want this code to be uploaded to the project lambdas)"),(0,n.kt)("pre",null,(0,n.kt)("code",{parentName:"pre",className:"language-bash"},"npm install --save-dev @boostercloud/rocket-static-sites-aws-infrastructure\n")),(0,n.kt)("p",null," In your Booster config file, pass a RocketDescriptor in the config.rockets array to configuring the static site rocket:"),(0,n.kt)("pre",null,(0,n.kt)("code",{parentName:"pre",className:"language-typescript"},"import { Booster } from '@boostercloud/framework-core'\nimport { BoosterConfig } from '@boostercloud/framework-types'\n\nBooster.configure('development', (config: BoosterConfig): void => {\n config.appName = 'my-store'\n config.rockets = [\n {\n packageName: '@boostercloud/rocket-static-sites-aws-infrastructure', \n parameters: {\n bucketName: 'test-bucket-name', // Required\n rootPath: './frontend/dist', // Defaults to ./public\n indexFile: 'main.html', // File to render when users access the CLoudFormation URL. Defaults to index.html\n errorFile: 'error.html', // File to render when there's an error. Defaults to 404.html\n }\n },\n ]\n})\n")))}u.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/1b08e8f8.53f180de.js b/assets/js/1b08e8f8.53f180de.js new file mode 100644 index 000000000..22d4e62cb --- /dev/null +++ b/assets/js/1b08e8f8.53f180de.js @@ -0,0 +1 @@ +"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[1588],{3905:(e,t,n)=>{n.d(t,{Zo:()=>d,kt:()=>m});var o=n(7294);function r(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function a(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var o=Object.getOwnPropertySymbols(e);t&&(o=o.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,o)}return n}function l(e){for(var t=1;t=0||(r[n]=e[n]);return r}(e,t);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);for(o=0;o=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(r[n]=e[n])}return r}var s=o.createContext({}),c=function(e){var t=o.useContext(s),n=t;return e&&(n="function"==typeof e?e(t):l(l({},t),e)),n},d=function(e){var t=c(e.components);return o.createElement(s.Provider,{value:t},e.children)},u={inlineCode:"code",wrapper:function(e){var t=e.children;return o.createElement(o.Fragment,{},t)}},p=o.forwardRef((function(e,t){var n=e.components,r=e.mdxType,a=e.originalType,s=e.parentName,d=i(e,["components","mdxType","originalType","parentName"]),p=c(n),m=r,f=p["".concat(s,".").concat(m)]||p[m]||u[m]||a;return n?o.createElement(f,l(l({ref:t},d),{},{components:n})):o.createElement(f,l({ref:t},d))}));function m(e,t){var n=arguments,r=t&&t.mdxType;if("string"==typeof e||r){var a=n.length,l=new Array(a);l[0]=p;var i={};for(var s in t)hasOwnProperty.call(t,s)&&(i[s]=t[s]);i.originalType=e,i.mdxType="string"==typeof e?e:r,l[1]=i;for(var c=2;c{n.d(t,{Z:()=>l});var o=n(7294),r=n(6010);const a="tabItem_Ymn6";function l(e){let{children:t,hidden:n,className:l}=e;return 
o.createElement("div",{role:"tabpanel",className:(0,r.Z)(a,l),hidden:n},t)}},4866:(e,t,n)=>{n.d(t,{Z:()=>w});var o=n(7462),r=n(7294),a=n(6010),l=n(2466),i=n(6550),s=n(1980),c=n(7392),d=n(12);function u(e){return function(e){var t;return(null==(t=r.Children.map(e,(e=>{if(!e||(0,r.isValidElement)(e)&&function(e){const{props:t}=e;return!!t&&"object"==typeof t&&"value"in t}(e))return e;throw new Error(`Docusaurus error: Bad child <${"string"==typeof e.type?e.type:e.type.name}>: all children of the component should be , and every should have a unique "value" prop.`)})))?void 0:t.filter(Boolean))??[]}(e).map((e=>{let{props:{value:t,label:n,attributes:o,default:r}}=e;return{value:t,label:n,attributes:o,default:r}}))}function p(e){const{values:t,children:n}=e;return(0,r.useMemo)((()=>{const e=t??u(n);return function(e){const t=(0,c.l)(e,((e,t)=>e.value===t.value));if(t.length>0)throw new Error(`Docusaurus error: Duplicate values "${t.map((e=>e.value)).join(", ")}" found in . Every value needs to be unique.`)}(e),e}),[t,n])}function m(e){let{value:t,tabValues:n}=e;return n.some((e=>e.value===t))}function f(e){let{queryString:t=!1,groupId:n}=e;const o=(0,i.k6)(),a=function(e){let{queryString:t=!1,groupId:n}=e;if("string"==typeof t)return t;if(!1===t)return null;if(!0===t&&!n)throw new Error('Docusaurus error: The component groupId prop is required if queryString=true, because this value is used as the search param name. You can also provide an explicit value such as queryString="my-search-param".');return n??null}({queryString:t,groupId:n});return[(0,s._X)(a),(0,r.useCallback)((e=>{if(!a)return;const t=new URLSearchParams(o.location.search);t.set(a,e),o.replace({...o.location,search:t.toString()})}),[a,o])]}function g(e){const{defaultValue:t,queryString:n=!1,groupId:o}=e,a=p(e),[l,i]=(0,r.useState)((()=>function(e){let{defaultValue:t,tabValues:n}=e;if(0===n.length)throw new Error("Docusaurus error: the component requires at least one children component");if(t){if(!m({value:t,tabValues:n}))throw new Error(`Docusaurus error: The has a defaultValue "${t}" but none of its children has the corresponding value. Available values are: ${n.map((e=>e.value)).join(", ")}. 
If you intend to show no default tab, use defaultValue={null} instead.`);return t}const o=n.find((e=>e.default))??n[0];if(!o)throw new Error("Unexpected error: 0 tabValues");return o.value}({defaultValue:t,tabValues:a}))),[s,c]=f({queryString:n,groupId:o}),[u,g]=function(e){let{groupId:t}=e;const n=function(e){return e?`docusaurus.tab.${e}`:null}(t),[o,a]=(0,d.Nk)(n);return[o,(0,r.useCallback)((e=>{n&&a.set(e)}),[n,a])]}({groupId:o}),k=(()=>{const e=s??u;return m({value:e,tabValues:a})?e:null})();(0,r.useLayoutEffect)((()=>{k&&i(k)}),[k]);return{selectedValue:l,selectValue:(0,r.useCallback)((e=>{if(!m({value:e,tabValues:a}))throw new Error(`Can't select invalid tab value=${e}`);i(e),c(e),g(e)}),[c,g,a]),tabValues:a}}var k=n(2389);const y="tabList__CuJ",N="tabItem_LNqP";function h(e){let{className:t,block:n,selectedValue:i,selectValue:s,tabValues:c}=e;const d=[],{blockElementScrollPositionUntilNextRender:u}=(0,l.o5)(),p=e=>{const t=e.currentTarget,n=d.indexOf(t),o=c[n].value;o!==i&&(u(t),s(o))},m=e=>{var t;let n=null;switch(e.key){case"Enter":p(e);break;case"ArrowRight":{const t=d.indexOf(e.currentTarget)+1;n=d[t]??d[0];break}case"ArrowLeft":{const t=d.indexOf(e.currentTarget)-1;n=d[t]??d[d.length-1];break}}null==(t=n)||t.focus()};return r.createElement("ul",{role:"tablist","aria-orientation":"horizontal",className:(0,a.Z)("tabs",{"tabs--block":n},t)},c.map((e=>{let{value:t,label:n,attributes:l}=e;return r.createElement("li",(0,o.Z)({role:"tab",tabIndex:i===t?0:-1,"aria-selected":i===t,key:t,ref:e=>d.push(e),onKeyDown:m,onClick:p},l,{className:(0,a.Z)("tabs__item",N,null==l?void 0:l.className,{"tabs__item--active":i===t})}),n??t)})))}function b(e){let{lazy:t,children:n,selectedValue:o}=e;const a=(Array.isArray(n)?n:[n]).filter(Boolean);if(t){const e=a.find((e=>e.props.value===o));return e?(0,r.cloneElement)(e,{className:"margin-top--md"}):null}return r.createElement("div",{className:"margin-top--md"},a.map(((e,t)=>(0,r.cloneElement)(e,{key:t,hidden:e.props.value!==o}))))}function v(e){const t=g(e);return r.createElement("div",{className:(0,a.Z)("tabs-container",y)},r.createElement(h,(0,o.Z)({},e,t)),r.createElement(b,(0,o.Z)({},e,t)))}function w(e){const t=(0,k.Z)();return r.createElement(v,(0,o.Z)({key:String(t)},e))}},8379:(e,t,n)=>{n.r(t),n.d(t,{assets:()=>d,contentTitle:()=>s,default:()=>m,frontMatter:()=>i,metadata:()=>c,toc:()=>u});var o=n(7462),r=(n(7294),n(3905)),a=n(5162),l=n(4866);const i={},s="File Uploads Rocket",c={unversionedId:"going-deeper/rockets/rocket-file-uploads",id:"going-deeper/rockets/rocket-file-uploads",title:"File Uploads Rocket",description:"This package is a configurable rocket to add a storage API to your Booster applications.",source:"@site/docs/10_going-deeper/rockets/rocket-file-uploads.md",sourceDirName:"10_going-deeper/rockets",slug:"/going-deeper/rockets/rocket-file-uploads",permalink:"/going-deeper/rockets/rocket-file-uploads",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/10_going-deeper/rockets/rocket-file-uploads.md",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1697652123,formattedLastUpdatedAt:"Oct 18, 2023",frontMatter:{},sidebar:"docs",previous:{title:"Extending Booster with Rockets!",permalink:"/going-deeper/rockets"},next:{title:"Backup Booster Rocket",permalink:"/going-deeper/rockets/rocket-backup-booster"}},d={},u=[{value:"Supported Providers",id:"supported-providers",level:2},{value:"Overview",id:"overview",level:2},{value:"Usage",id:"usage",level:2},{value:"Rocket Methods 
Usage",id:"rocket-methods-usage",level:2},{value:"Azure Roles",id:"azure-roles",level:2},{value:"Rocket Methods Usage",id:"rocket-methods-usage-1",level:2},{value:"Rocket Methods Usage",id:"rocket-methods-usage-2",level:2},{value:"Security",id:"security",level:2},{value:"Events",id:"events",level:2},{value:"TODOs",id:"todos",level:2}],p={toc:u};function m(e){let{components:t,...n}=e;return(0,r.kt)("wrapper",(0,o.Z)({},p,n,{components:t,mdxType:"MDXLayout"}),(0,r.kt)("h1",{id:"file-uploads-rocket"},"File Uploads Rocket"),(0,r.kt)("p",null,"This package is a configurable rocket to add a storage API to your Booster applications."),(0,r.kt)("admonition",{type:"info"},(0,r.kt)("p",{parentName:"admonition"},(0,r.kt)("a",{parentName:"p",href:"https://github.com/boostercloud/rocket-file-uploads"},"GitHub Repo"))),(0,r.kt)("h2",{id:"supported-providers"},"Supported Providers"),(0,r.kt)("ul",null,(0,r.kt)("li",{parentName:"ul"},"Azure Provider"),(0,r.kt)("li",{parentName:"ul"},"AWS Provider"),(0,r.kt)("li",{parentName:"ul"},"Local Provider")),(0,r.kt)("h2",{id:"overview"},"Overview"),(0,r.kt)("p",null,"This rocket provides some methods to access files stores in your cloud provider:"),(0,r.kt)("ul",null,(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"presignedPut"),": Returns a presigned put url and the necessary form params. With this url files can be uploaded directly to your provider."),(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"presignedGet"),": Returns a presigned get url to download a file. With this url files can be downloaded directly from your provider."),(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"list"),": Returns a list of files stored in the provider."),(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"deleteFile"),": Removes a file from a directory (only supported in AWS at the moment).")),(0,r.kt)("p",null,"These methods may be used from a Command in your project secured via JWT Token.\nThis rocket also provides a Booster Event each time a file is uploaded."),(0,r.kt)("h2",{id:"usage"},"Usage"),(0,r.kt)(l.Z,{groupId:"providers-usage",mdxType:"Tabs"},(0,r.kt)(a.Z,{value:"azure-provider",label:"Azure Provider",default:!0,mdxType:"TabItem"},(0,r.kt)("p",null,"Install needed dependency packages:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-bash"},"npm install --save @boostercloud/rocket-file-uploads-core @boostercloud/rocket-file-uploads-types\nnpm install --save @boostercloud/rocket-file-uploads-azure\n")),(0,r.kt)("p",null,"Also, you will need a devDependency in your project:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-bash"},"npm install --save-dev @boostercloud/rocket-file-uploads-azure-infrastructure\n")),(0,r.kt)("p",null,"In your Booster config file, configure your BoosterRocketFiles:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/config/config.ts"',title:'"src/config/config.ts"'},"import { Booster } from '@boostercloud/framework-core'\nimport { BoosterConfig } from '@boostercloud/framework-types'\nimport { BoosterRocketFiles } from '@boostercloud/rocket-file-uploads-core'\nimport { RocketFilesUserConfiguration } from '@boostercloud/rocket-file-uploads-types'\n\nconst rocketFilesConfigurationDefault: RocketFilesUserConfiguration = {\n storageName: 'STORAGE_NAME',\n containerName: 'CONTAINER_NAME',\n directories: ['DIRECTORY_1', 'DIRECTORY_2'],\n}\n\nconst 
rocketFilesConfigurationCms: RocketFilesUserConfiguration = {\n storageName: 'cmsst',\n containerName: 'rocketfiles',\n directories: ['cms1', 'cms2'],\n}\n\nBooster.configure('production', (config: BoosterConfig): void => {\n config.appName = 'TEST_APP_NAME'\n config.providerPackage = '@boostercloud/framework-provider-azure'\n config.rockets = [\n new BoosterRocketFiles(config, [rocketFilesConfigurationDefault, rocketFilesConfigurationCms]).rocketForAzure(),\n ]\n})\n\n")),(0,r.kt)("admonition",{type:"info"},(0,r.kt)("p",{parentName:"admonition"},"Available parameters are:"),(0,r.kt)("ul",{parentName:"admonition"},(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"storageName"),": Name of the storage repository."),(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"containerName"),": Directories container."),(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"directories"),": A list of folders where the files will be stored.")),(0,r.kt)("hr",{parentName:"admonition"}),(0,r.kt)("p",{parentName:"admonition"},"The structure created will be:"),(0,r.kt)("pre",{parentName:"admonition"},(0,r.kt)("code",{parentName:"pre",className:"language-text"},"\u251c\u2500\u2500 storageName\n\u2502 \u251c\u2500\u2500 containerName\n\u2502 \u2502 \u251c\u2500\u2500 directory\n")),(0,r.kt)("p",{parentName:"admonition"},(0,r.kt)("strong",{parentName:"p"},"NOTE:")," Azure Provider will use ",(0,r.kt)("inlineCode",{parentName:"p"},"storageName")," as the Storage Account Name.")),(0,r.kt)("h2",{id:"rocket-methods-usage"},"Rocket Methods Usage"),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Presigned Put"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"presignedPut")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory and filename you want to upload on the storage."),(0,r.kt)("p",null,"The storageName parameter is optional. 
It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-put.ts"',title:'"src/commands/file-upload-put.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadPut {\n public constructor(readonly directory: string, readonly fileName: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadPut, register: Register): Promise {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.presignedPut(command.directory, command.fileName)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadPut(input: {\n storageName: "clientst",\n directory: "client1",\n fileName: "myfile.txt"\n }\n )\n}\n')),(0,r.kt)("p",null,"Azure Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadPut": "https://clientst.blob.core.windows.net/rocketfiles/client1/myfile.txt?"\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Presigned Get"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"presignedGet")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory and filename you want to get on the storage."),(0,r.kt)("p",null,"The storageName parameter is optional. It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-get.ts"',title:'"src/commands/file-upload-get.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadGet {\n public constructor(readonly directory: string, readonly fileName: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadGet, register: Register): Promise {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.presignedGet(command.directory, command.fileName)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadGet(input: {\n storageName: "clientst",\n directory: "client1",\n fileName: "myfile.txt"\n }\n )\n}\n')),(0,r.kt)("p",null,"Azure Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadGet": "https://clientst.blob.core.windows.net/rocketfiles/folder01%2Fmyfile.txt?"\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"List"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"list")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory you want to get the info and return the formatted results."),(0,r.kt)("p",null,"The storageName parameter is optional. 
It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-list.ts"',title:'"src/commands/file-upload-list.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\nimport { ListItem } from '@boostercloud/rocket-file-uploads-types'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadList {\n public constructor(readonly directory: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadList, register: Register): Promise> {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.list(command.directory)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadList(input: {\n storageName: "clientst",\n directory: "client1"\n }\n )\n}\n')),(0,r.kt)("p",null,"Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadList": [\n {\n "name": "client1/myfile.txt",\n "properties": {\n "createdOn": "2022-10-26T05:40:47.000Z",\n "lastModified": "2022-10-26T05:40:47.000Z",\n "contentLength": 6,\n "contentType": "text/plain"\n }\n }\n ]\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Delete File"),"Currently, the option to delete a file is only available on AWS. If this is a feature you were looking for, please let us know on Discord. Alternatively, you can implement this feature and submit a pull request on GitHub for this Rocket!"),(0,r.kt)("h2",{id:"azure-roles"},"Azure Roles"),(0,r.kt)("admonition",{type:"info"},(0,r.kt)("p",{parentName:"admonition"},"Starting at version ",(0,r.kt)("strong",{parentName:"p"},"0.31.0")," this Rocket use Managed Identities instead of Connection Strings. Please, check that you have the required permissions to assign roles ",(0,r.kt)("a",{parentName:"p",href:"https://learn.microsoft.com/en-us/azure/role-based-access-control/role-assignments-portal-managed-identity#prerequisites"},"https://learn.microsoft.com/en-us/azure/role-based-access-control/role-assignments-portal-managed-identity#prerequisites"))),(0,r.kt)("p",null,"For uploading files to Azure you need the Storage Blob Data Contributor role. 
This can be assigned to a user using the portal or with the next scripts:"),(0,r.kt)("p",null,"First, check if you have the correct permissions:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-bash"},'ACCOUNT_NAME=""\nCONTAINER_NAME=""\n\n# use this to test if you have the correct permissions\naz storage blob exists --account-name $ACCOUNT_NAME `\n --container-name $CONTAINER_NAME `\n --name blob1.txt --auth-mode login\n')),(0,r.kt)("p",null,"If you don't have it, then run this script as admin:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-bash"},'ACCOUNT_NAME=""\nCONTAINER_NAME=""\n\nOBJECT_ID=$(az ad user list --query "[?mailNickname==\'\'].objectId" -o tsv)\nSTORAGE_ID=$(az storage account show -n $ACCOUNT_NAME --query id -o tsv)\n\naz role assignment create \\\n --role "Storage Blob Data Contributor" \\\n --assignee $OBJECT_ID \\\n --scope "$STORAGE_ID/blobServices/default/containers/$CONTAINER_NAME"\n'))),(0,r.kt)(a.Z,{value:"aws-provider",label:"AWS Provider",default:!0,mdxType:"TabItem"},(0,r.kt)("p",null,"Install needed dependency packages:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-bash"},"npm install --save @boostercloud/rocket-file-uploads-core @boostercloud/rocket-file-uploads-types\nnpm install --save @boostercloud/rocket-file-uploads-aws\n")),(0,r.kt)("p",null,"Also, you will need a devDependency in your project:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-bash"},"npm install --save-dev @boostercloud/rocket-file-uploads-aws-infrastructure\n")),(0,r.kt)("p",null,"In your Booster config file, configure your BoosterRocketFiles:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/config/config.ts"',title:'"src/config/config.ts"'},"import { Booster } from '@boostercloud/framework-core'\nimport { BoosterConfig } from '@boostercloud/framework-types'\nimport { BoosterRocketFiles } from '@boostercloud/rocket-file-uploads-core'\nimport { RocketFilesUserConfiguration } from '@boostercloud/rocket-file-uploads-types'\n\nconst rocketFilesConfigurationDefault: RocketFilesUserConfiguration = {\n storageName: 'STORAGE_NAME',\n containerName: '', // Not used in AWS, you can just pass an empty string\n directories: ['DIRECTORY_1', 'DIRECTORY_2'],\n}\n\nconst rocketFilesConfigurationCms: RocketFilesUserConfiguration = {\n storageName: 'cmsst',\n containerName: '', // Not used in AWS, you can just pass an empty string\n directories: ['cms1', 'cms2'],\n}\n\nBooster.configure('production', (config: BoosterConfig): void => {\n config.appName = 'TEST_APP_NAME'\n config.providerPackage = '@boostercloud/framework-provider-aws'\n config.rockets = [\n new BoosterRocketFiles(config, [rocketFilesConfigurationDefault, rocketFilesConfigurationCms]).rocketForAWS(),\n ]\n})\n")),(0,r.kt)("admonition",{type:"info"},(0,r.kt)("p",{parentName:"admonition"},"Available parameters are:"),(0,r.kt)("ul",{parentName:"admonition"},(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"storageName"),": Name of the storage repository."),(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"directories"),": A list of folders where the files will be stored.")),(0,r.kt)("hr",{parentName:"admonition"}),(0,r.kt)("p",{parentName:"admonition"},"The structure created will be:"),(0,r.kt)("pre",{parentName:"admonition"},(0,r.kt)("code",{parentName:"pre",className:"language-text"},"\u251c\u2500\u2500 storageName\n\u2502 
\u251c\u2500\u2500 directory\n"))),(0,r.kt)("h2",{id:"rocket-methods-usage-1"},"Rocket Methods Usage"),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Presigned Put"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"presignedPut")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory and filename you want to upload on the storage."),(0,r.kt)("p",null,"The storageName parameter is optional. It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-put.ts"',title:'"src/commands/file-upload-put.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadPut {\n public constructor(readonly directory: string, readonly fileName: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadPut, register: Register): Promise {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.presignedPut(command.directory, command.fileName) as Promise\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadPut(input: { \n directory: "files", \n fileName: "lol.jpg"\n }) {\n url\n fields\n }\n}\n')),(0,r.kt)("p",null,"AWS Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadPut": {\n "url": "https://s3.eu-west-1.amazonaws.com/myappstorage",\n "fields": {\n "Key": "files/lol.jpg",\n "bucket": "myappstorage",\n "X-Amz-Algorithm": "AWS4-HMAC-SHA256",\n "X-Amz-Credential": "blablabla.../eu-west-1/s3/aws4_request",\n "X-Amz-Date": "20230207T142138Z",\n "X-Amz-Security-Token": "IQoJb3JpZ2... blablabla",\n "Policy": "eyJleHBpcmF0a... blablabla",\n "X-Amz-Signature": "60511... blablabla"\n }\n }\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Presigned Get"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"presignedGet")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory and filename you want to get on the storage."),(0,r.kt)("p",null,"The storageName parameter is optional. 
It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-get.ts"',title:'"src/commands/file-upload-get.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadGet {\n public constructor(readonly directory: string, readonly fileName: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadGet, register: Register): Promise {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.presignedGet(command.directory, command.fileName)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadGet(input: {\n storageName: "clientst",\n directory: "client1",\n fileName: "myfile.txt"\n }\n )\n}\n')),(0,r.kt)("p",null,"AWS Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadGet": "https://myappstorage.s3.eu-west-1.amazonaws.com/client1/myfile.txt?"\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"List"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"list")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory you want to get the info and return the formatted results."),(0,r.kt)("p",null,"The storageName parameter is optional. It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-list.ts"',title:'"src/commands/file-upload-list.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\nimport { ListItem } from '@boostercloud/rocket-file-uploads-types'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadList {\n public constructor(readonly directory: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadList, register: Register): Promise> {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.list(command.directory)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadList(input: {\n storageName: "clientst",\n directory: "client1"\n }\n )\n}\n')),(0,r.kt)("p",null,"Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadList": [\n {\n "name": "client1/myfile.txt",\n "properties": {\n "createdOn": "2022-10-26T05:40:47.000Z",\n "lastModified": "2022-10-26T05:40:47.000Z",\n "contentLength": 6,\n "contentType": "text/plain"\n }\n }\n ]\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Delete File"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"deleteFile")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory and 
file name you want to delete."),(0,r.kt)("p",null,"The storageName parameter is optional. It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/delete-file.ts"',title:'"src/commands/delete-file.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\nimport { ListItem } from '@boostercloud/rocket-file-uploads-types'\n\n@Command({\n authorize: 'all',\n})\nexport class DeleteFile {\n public constructor(readonly directory: string, readonly fileName: string, readonly storageName?: string) {}\n\n public static async handle(command: DeleteFile, register: Register): Promise {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.deleteFile(command.directory, command.fileName)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n DeleteFile(input: {\n storageName: "clientst",\n directory: "client1",\n fileName: "myfile.txt"\n }\n )\n}\n')),(0,r.kt)("p",null,"Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "DeleteFile": true\n }\n}\n')))),(0,r.kt)(a.Z,{value:"local-provider",label:"Local Provider",default:!0,mdxType:"TabItem"},(0,r.kt)("p",null,"Install needed dependency packages:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-bash"},"npm install --save @boostercloud/rocket-file-uploads-core @boostercloud/rocket-file-uploads-types\nnpm install --save @boostercloud/rocket-file-uploads-local\n")),(0,r.kt)("p",null,"Also, you will need a devDependency in your project:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre"},"npm install --save-dev @boostercloud/rocket-file-uploads-local-infrastructure\n")),(0,r.kt)("p",null,"In your Booster config file, configure your BoosterRocketFiles:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/config/config.ts"',title:'"src/config/config.ts"'},"import { Booster } from '@boostercloud/framework-core'\nimport { BoosterConfig } from '@boostercloud/framework-types'\nimport { BoosterRocketFiles } from '@boostercloud/rocket-file-uploads-core'\nimport { RocketFilesUserConfiguration } from '@boostercloud/rocket-file-uploads-types'\n\nconst rocketFilesConfigurationDefault: RocketFilesUserConfiguration = {\n storageName: 'STORAGE_NAME',\n containerName: 'CONTAINER_NAME',\n directories: ['DIRECTORY_1', 'DIRECTORY_2'],\n}\n\nconst rocketFilesConfigurationCms: RocketFilesUserConfiguration = {\n storageName: 'cmsst',\n containerName: 'rocketfiles',\n directories: ['cms1', 'cms2'],\n}\n\nBooster.configure('local', (config: BoosterConfig): void => {\n config.appName = 'TEST_APP_NAME'\n config.providerPackage = '@boostercloud/framework-provider-local'\n config.rockets = [\n new BoosterRocketFiles(config, [rocketFilesConfigurationDefault, rocketFilesConfigurationCms]).rocketForLocal(),\n ]\n})\n")),(0,r.kt)("admonition",{type:"info"},(0,r.kt)("p",{parentName:"admonition"},"Available parameters are:"),(0,r.kt)("ul",{parentName:"admonition"},(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"storageName"),": Name of the storage 
repository."),(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"containerName"),": Directories container."),(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"directories"),": A list of folders where the files will be stored.")),(0,r.kt)("hr",{parentName:"admonition"}),(0,r.kt)("p",{parentName:"admonition"},"The structure created will be:"),(0,r.kt)("pre",{parentName:"admonition"},(0,r.kt)("code",{parentName:"pre",className:"language-text"},"\u251c\u2500\u2500 storageName\n\u2502 \u251c\u2500\u2500 containerName\n\u2502 \u2502 \u251c\u2500\u2500 directory\n")),(0,r.kt)("p",{parentName:"admonition"},(0,r.kt)("strong",{parentName:"p"},"NOTE:")," Local Provider will use ",(0,r.kt)("inlineCode",{parentName:"p"},"storageName")," as the root folder name.")),(0,r.kt)("h2",{id:"rocket-methods-usage-2"},"Rocket Methods Usage"),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Presigned Put"),"Create a command in your application and call the `presignedPut` method on the `FileHandler` class with the directory and filename you want to upload on the storage.",(0,r.kt)("p",null,"The storageName parameter is optional. It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-put.ts"',title:'"src/commands/file-upload-put.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadPut {\n public constructor(readonly directory: string, readonly fileName: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadPut, register: Register): Promise {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.presignedPut(command.directory, command.fileName)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadPut(input: {\n storageName: "clientst",\n directory: "client1",\n fileName: "myfile.txt"\n }\n )\n}\n')),(0,r.kt)("p",null,"Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadPut": "http://localhost:3000/clientst/rocketfiles/client1/myfile.txt"\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Presigned Get"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"presignedGet")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory and filename you want to get on the storage."),(0,r.kt)("p",null,"The storageName parameter is optional. 
It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-get.ts"',title:'"src/commands/file-upload-get.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadGet {\n public constructor(readonly directory: string, readonly fileName: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadGet, register: Register): Promise {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.presignedGet(command.directory, command.fileName)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadGet(input: {\n storageName: "clientst",\n directory: "client1",\n fileName: "myfile.txt"\n }\n )\n}\n')),(0,r.kt)("p",null,"Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadGet": "http://localhost:3000/clientst/rocketfiles/client1/myfile.txt"\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"List"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"list")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory you want to get the info and return the formatted results."),(0,r.kt)("p",null,"The storageName parameter is optional. It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-list.ts"',title:'"src/commands/file-upload-list.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\nimport { ListItem } from '@boostercloud/rocket-file-uploads-types'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadList {\n public constructor(readonly directory: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadList, register: Register): Promise> {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.list(command.directory)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadList(input: {\n storageName: "clientst",\n directory: "client1"\n }\n )\n}\n')),(0,r.kt)("p",null,"Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadList": [\n {\n "name": "client1/myfile.txt",\n "properties": {\n "lastModified": "2022-10-26T10:35:18.905Z"\n }\n }\n ]\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Delete File"),"Currently, the option to delete a file is only available on AWS. If this is a feature you were looking for, please let us know on Discord. 
Alternatively, you can implement this feature and submit a pull request on GitHub for this Rocket!"),(0,r.kt)("h2",{id:"security"},"Security"),(0,r.kt)("p",null,"Local Provider doesn't check paths. You should check that the directory and files passed as paratemers are valid."))),(0,r.kt)("hr",null),(0,r.kt)("h2",{id:"events"},"Events"),(0,r.kt)("p",null,"For each uploaded file a new event will be automatically generated and properly reduced on the entity ",(0,r.kt)("inlineCode",{parentName:"p"},"UploadedFileEntity"),"."),(0,r.kt)(l.Z,{groupId:"providers-usage",mdxType:"Tabs"},(0,r.kt)(a.Z,{value:"azure-and-aws-provider",label:"Azure & AWS Provider",default:!0,mdxType:"TabItem"},(0,r.kt)("p",null,"The event will look like this:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript"},'{\n "version": 1,\n "kind": "snapshot",\n "superKind": "domain",\n "requestID": "xxx",\n "entityID": "xxxx",\n "entityTypeName": "UploadedFileEntity",\n "typeName": "UploadedFileEntity",\n "value": {\n "id": "xxx",\n "metadata": {\n // A bunch of fields (depending on Azure or AWS)\n }\n },\n "createdAt": "2022-10-26T10:23:36.562Z",\n "snapshottedEventCreatedAt": "2022-10-26T10:23:32.34Z",\n "entityTypeName_entityID_kind": "UploadedFileEntity-xxx-b842-x-8975-xx-snapshot",\n "id": "x-x-x-x-x",\n "_rid": "x==",\n "_self": "dbs/x==/colls/x=/docs/x==/",\n "_etag": "\\"x-x-0500-0000-x\\"",\n "_attachments": "attachments/",\n "_ts": 123456\n}\n'))),(0,r.kt)(a.Z,{value:"local-provider",label:"Local Provider",default:!0,mdxType:"TabItem"},(0,r.kt)("p",null,"The event will look like this:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript"},'{\n "version": 1,\n "kind": "snapshot",\n "superKind": "domain",\n "requestID": "x",\n "entityID": "x",\n "entityTypeName": "UploadedFileEntity",\n "typeName": "UploadedFileEntity",\n "value": {\n "id": "x",\n "metadata": {\n "uri": "http://localhost:3000/clientst/rocketfiles/client1/myfile.txt",\n "name": "client1/myfile.txt"\n }\n },\n "createdAt": "2022-10-26T10:35:18.967Z",\n "snapshottedEventCreatedAt": "2022-10-26T10:35:18.958Z",\n "_id": "lMolccTNJVojXiLz"\n}\n')))),(0,r.kt)("h2",{id:"todos"},"TODOs"),(0,r.kt)("ul",null,(0,r.kt)("li",{parentName:"ul"},"Add file deletion to Azure and Local (only supported in AWS at the moment)."),(0,r.kt)("li",{parentName:"ul"},"Optional storage deletion when unmounting the stack."),(0,r.kt)("li",{parentName:"ul"},"Optional events, in case you don't want to store that information in the events-store."),(0,r.kt)("li",{parentName:"ul"},"When deleting a file, save a deletion event in the events-store. 
Only uploads are stored at the moment.")))}m.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/1b08e8f8.93115798.js b/assets/js/1b08e8f8.93115798.js deleted file mode 100644 index 85e51bf55..000000000 --- a/assets/js/1b08e8f8.93115798.js +++ /dev/null @@ -1 +0,0 @@ -"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[1588],{3905:(e,t,n)=>{n.d(t,{Zo:()=>d,kt:()=>m});var o=n(7294);function r(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function a(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var o=Object.getOwnPropertySymbols(e);t&&(o=o.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,o)}return n}function l(e){for(var t=1;t=0||(r[n]=e[n]);return r}(e,t);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);for(o=0;o=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(r[n]=e[n])}return r}var s=o.createContext({}),c=function(e){var t=o.useContext(s),n=t;return e&&(n="function"==typeof e?e(t):l(l({},t),e)),n},d=function(e){var t=c(e.components);return o.createElement(s.Provider,{value:t},e.children)},u={inlineCode:"code",wrapper:function(e){var t=e.children;return o.createElement(o.Fragment,{},t)}},p=o.forwardRef((function(e,t){var n=e.components,r=e.mdxType,a=e.originalType,s=e.parentName,d=i(e,["components","mdxType","originalType","parentName"]),p=c(n),m=r,f=p["".concat(s,".").concat(m)]||p[m]||u[m]||a;return n?o.createElement(f,l(l({ref:t},d),{},{components:n})):o.createElement(f,l({ref:t},d))}));function m(e,t){var n=arguments,r=t&&t.mdxType;if("string"==typeof e||r){var a=n.length,l=new Array(a);l[0]=p;var i={};for(var s in t)hasOwnProperty.call(t,s)&&(i[s]=t[s]);i.originalType=e,i.mdxType="string"==typeof e?e:r,l[1]=i;for(var c=2;c{n.d(t,{Z:()=>l});var o=n(7294),r=n(6010);const a="tabItem_Ymn6";function l(e){let{children:t,hidden:n,className:l}=e;return o.createElement("div",{role:"tabpanel",className:(0,r.Z)(a,l),hidden:n},t)}},4866:(e,t,n)=>{n.d(t,{Z:()=>w});var o=n(7462),r=n(7294),a=n(6010),l=n(2466),i=n(6550),s=n(1980),c=n(7392),d=n(12);function u(e){return function(e){var t;return(null==(t=r.Children.map(e,(e=>{if(!e||(0,r.isValidElement)(e)&&function(e){const{props:t}=e;return!!t&&"object"==typeof t&&"value"in t}(e))return e;throw new Error(`Docusaurus error: Bad child <${"string"==typeof e.type?e.type:e.type.name}>: all children of the component should be , and every should have a unique "value" prop.`)})))?void 0:t.filter(Boolean))??[]}(e).map((e=>{let{props:{value:t,label:n,attributes:o,default:r}}=e;return{value:t,label:n,attributes:o,default:r}}))}function p(e){const{values:t,children:n}=e;return(0,r.useMemo)((()=>{const e=t??u(n);return function(e){const t=(0,c.l)(e,((e,t)=>e.value===t.value));if(t.length>0)throw new Error(`Docusaurus error: Duplicate values "${t.map((e=>e.value)).join(", ")}" found in . Every value needs to be unique.`)}(e),e}),[t,n])}function m(e){let{value:t,tabValues:n}=e;return n.some((e=>e.value===t))}function f(e){let{queryString:t=!1,groupId:n}=e;const o=(0,i.k6)(),a=function(e){let{queryString:t=!1,groupId:n}=e;if("string"==typeof t)return t;if(!1===t)return null;if(!0===t&&!n)throw new Error('Docusaurus error: The component groupId prop is required if queryString=true, because this value is used as the search param name. 
You can also provide an explicit value such as queryString="my-search-param".');return n??null}({queryString:t,groupId:n});return[(0,s._X)(a),(0,r.useCallback)((e=>{if(!a)return;const t=new URLSearchParams(o.location.search);t.set(a,e),o.replace({...o.location,search:t.toString()})}),[a,o])]}function g(e){const{defaultValue:t,queryString:n=!1,groupId:o}=e,a=p(e),[l,i]=(0,r.useState)((()=>function(e){let{defaultValue:t,tabValues:n}=e;if(0===n.length)throw new Error("Docusaurus error: the component requires at least one children component");if(t){if(!m({value:t,tabValues:n}))throw new Error(`Docusaurus error: The has a defaultValue "${t}" but none of its children has the corresponding value. Available values are: ${n.map((e=>e.value)).join(", ")}. If you intend to show no default tab, use defaultValue={null} instead.`);return t}const o=n.find((e=>e.default))??n[0];if(!o)throw new Error("Unexpected error: 0 tabValues");return o.value}({defaultValue:t,tabValues:a}))),[s,c]=f({queryString:n,groupId:o}),[u,g]=function(e){let{groupId:t}=e;const n=function(e){return e?`docusaurus.tab.${e}`:null}(t),[o,a]=(0,d.Nk)(n);return[o,(0,r.useCallback)((e=>{n&&a.set(e)}),[n,a])]}({groupId:o}),k=(()=>{const e=s??u;return m({value:e,tabValues:a})?e:null})();(0,r.useLayoutEffect)((()=>{k&&i(k)}),[k]);return{selectedValue:l,selectValue:(0,r.useCallback)((e=>{if(!m({value:e,tabValues:a}))throw new Error(`Can't select invalid tab value=${e}`);i(e),c(e),g(e)}),[c,g,a]),tabValues:a}}var k=n(2389);const y="tabList__CuJ",N="tabItem_LNqP";function h(e){let{className:t,block:n,selectedValue:i,selectValue:s,tabValues:c}=e;const d=[],{blockElementScrollPositionUntilNextRender:u}=(0,l.o5)(),p=e=>{const t=e.currentTarget,n=d.indexOf(t),o=c[n].value;o!==i&&(u(t),s(o))},m=e=>{var t;let n=null;switch(e.key){case"Enter":p(e);break;case"ArrowRight":{const t=d.indexOf(e.currentTarget)+1;n=d[t]??d[0];break}case"ArrowLeft":{const t=d.indexOf(e.currentTarget)-1;n=d[t]??d[d.length-1];break}}null==(t=n)||t.focus()};return r.createElement("ul",{role:"tablist","aria-orientation":"horizontal",className:(0,a.Z)("tabs",{"tabs--block":n},t)},c.map((e=>{let{value:t,label:n,attributes:l}=e;return r.createElement("li",(0,o.Z)({role:"tab",tabIndex:i===t?0:-1,"aria-selected":i===t,key:t,ref:e=>d.push(e),onKeyDown:m,onClick:p},l,{className:(0,a.Z)("tabs__item",N,null==l?void 0:l.className,{"tabs__item--active":i===t})}),n??t)})))}function b(e){let{lazy:t,children:n,selectedValue:o}=e;const a=(Array.isArray(n)?n:[n]).filter(Boolean);if(t){const e=a.find((e=>e.props.value===o));return e?(0,r.cloneElement)(e,{className:"margin-top--md"}):null}return r.createElement("div",{className:"margin-top--md"},a.map(((e,t)=>(0,r.cloneElement)(e,{key:t,hidden:e.props.value!==o}))))}function v(e){const t=g(e);return r.createElement("div",{className:(0,a.Z)("tabs-container",y)},r.createElement(h,(0,o.Z)({},e,t)),r.createElement(b,(0,o.Z)({},e,t)))}function w(e){const t=(0,k.Z)();return r.createElement(v,(0,o.Z)({key:String(t)},e))}},8379:(e,t,n)=>{n.r(t),n.d(t,{assets:()=>d,contentTitle:()=>s,default:()=>m,frontMatter:()=>i,metadata:()=>c,toc:()=>u});var o=n(7462),r=(n(7294),n(3905)),a=n(5162),l=n(4866);const i={},s="File Uploads Rocket",c={unversionedId:"going-deeper/rockets/rocket-file-uploads",id:"going-deeper/rockets/rocket-file-uploads",title:"File Uploads Rocket",description:"This package is a configurable rocket to add a storage API to your Booster 
applications.",source:"@site/docs/10_going-deeper/rockets/rocket-file-uploads.md",sourceDirName:"10_going-deeper/rockets",slug:"/going-deeper/rockets/rocket-file-uploads",permalink:"/going-deeper/rockets/rocket-file-uploads",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/10_going-deeper/rockets/rocket-file-uploads.md",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1696768385,formattedLastUpdatedAt:"Oct 8, 2023",frontMatter:{},sidebar:"docs",previous:{title:"Extending Booster with Rockets!",permalink:"/going-deeper/rockets"},next:{title:"Backup Booster Rocket",permalink:"/going-deeper/rockets/rocket-backup-booster"}},d={},u=[{value:"Supported Providers",id:"supported-providers",level:2},{value:"Overview",id:"overview",level:2},{value:"Usage",id:"usage",level:2},{value:"Rocket Methods Usage",id:"rocket-methods-usage",level:2},{value:"Azure Roles",id:"azure-roles",level:2},{value:"Rocket Methods Usage",id:"rocket-methods-usage-1",level:2},{value:"Rocket Methods Usage",id:"rocket-methods-usage-2",level:2},{value:"Security",id:"security",level:2},{value:"Events",id:"events",level:2},{value:"TODOs",id:"todos",level:2}],p={toc:u};function m(e){let{components:t,...n}=e;return(0,r.kt)("wrapper",(0,o.Z)({},p,n,{components:t,mdxType:"MDXLayout"}),(0,r.kt)("h1",{id:"file-uploads-rocket"},"File Uploads Rocket"),(0,r.kt)("p",null,"This package is a configurable rocket to add a storage API to your Booster applications."),(0,r.kt)("admonition",{type:"info"},(0,r.kt)("p",{parentName:"admonition"},(0,r.kt)("a",{parentName:"p",href:"https://github.com/boostercloud/rocket-file-uploads"},"GitHub Repo"))),(0,r.kt)("h2",{id:"supported-providers"},"Supported Providers"),(0,r.kt)("ul",null,(0,r.kt)("li",{parentName:"ul"},"Azure Provider"),(0,r.kt)("li",{parentName:"ul"},"AWS Provider"),(0,r.kt)("li",{parentName:"ul"},"Local Provider")),(0,r.kt)("h2",{id:"overview"},"Overview"),(0,r.kt)("p",null,"This rocket provides some methods to access files stores in your cloud provider:"),(0,r.kt)("ul",null,(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"presignedPut"),": Returns a presigned put url and the necessary form params. With this url files can be uploaded directly to your provider."),(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"presignedGet"),": Returns a presigned get url to download a file. 
With this url files can be downloaded directly from your provider."),(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"list"),": Returns a list of files stored in the provider."),(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"deleteFile"),": Removes a file from a directory (only supported in AWS at the moment).")),(0,r.kt)("p",null,"These methods may be used from a Command in your project secured via JWT Token.\nThis rocket also provides a Booster Event each time a file is uploaded."),(0,r.kt)("h2",{id:"usage"},"Usage"),(0,r.kt)(l.Z,{groupId:"providers-usage",mdxType:"Tabs"},(0,r.kt)(a.Z,{value:"azure-provider",label:"Azure Provider",default:!0,mdxType:"TabItem"},(0,r.kt)("p",null,"Install needed dependency packages:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-bash"},"npm install --save @boostercloud/rocket-file-uploads-core @boostercloud/rocket-file-uploads-types\nnpm install --save @boostercloud/rocket-file-uploads-azure\n")),(0,r.kt)("p",null,"Also, you will need a devDependency in your project:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-bash"},"npm install --save-dev @boostercloud/rocket-file-uploads-azure-infrastructure\n")),(0,r.kt)("p",null,"In your Booster config file, configure your BoosterRocketFiles:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/config/config.ts"',title:'"src/config/config.ts"'},"import { Booster } from '@boostercloud/framework-core'\nimport { BoosterConfig } from '@boostercloud/framework-types'\nimport { BoosterRocketFiles } from '@boostercloud/rocket-file-uploads-core'\nimport { RocketFilesUserConfiguration } from '@boostercloud/rocket-file-uploads-types'\n\nconst rocketFilesConfigurationDefault: RocketFilesUserConfiguration = {\n storageName: 'STORAGE_NAME',\n containerName: 'CONTAINER_NAME',\n directories: ['DIRECTORY_1', 'DIRECTORY_2'],\n}\n\nconst rocketFilesConfigurationCms: RocketFilesUserConfiguration = {\n storageName: 'cmsst',\n containerName: 'rocketfiles',\n directories: ['cms1', 'cms2'],\n}\n\nBooster.configure('production', (config: BoosterConfig): void => {\n config.appName = 'TEST_APP_NAME'\n config.providerPackage = '@boostercloud/framework-provider-azure'\n config.rockets = [\n new BoosterRocketFiles(config, [rocketFilesConfigurationDefault, rocketFilesConfigurationCms]).rocketForAzure(),\n ]\n})\n\n")),(0,r.kt)("admonition",{type:"info"},(0,r.kt)("p",{parentName:"admonition"},"Available parameters are:"),(0,r.kt)("ul",{parentName:"admonition"},(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"storageName"),": Name of the storage repository."),(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"containerName"),": Directories container."),(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"directories"),": A list of folders where the files will be stored.")),(0,r.kt)("hr",{parentName:"admonition"}),(0,r.kt)("p",{parentName:"admonition"},"The structure created will be:"),(0,r.kt)("pre",{parentName:"admonition"},(0,r.kt)("code",{parentName:"pre",className:"language-text"},"\u251c\u2500\u2500 storageName\n\u2502 \u251c\u2500\u2500 containerName\n\u2502 \u2502 \u251c\u2500\u2500 directory\n")),(0,r.kt)("p",{parentName:"admonition"},(0,r.kt)("strong",{parentName:"p"},"NOTE:")," Azure Provider will use ",(0,r.kt)("inlineCode",{parentName:"p"},"storageName")," as the Storage Account 
Name.")),(0,r.kt)("h2",{id:"rocket-methods-usage"},"Rocket Methods Usage"),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Presigned Put"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"presignedPut")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory and filename you want to upload on the storage."),(0,r.kt)("p",null,"The storageName parameter is optional. It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-put.ts"',title:'"src/commands/file-upload-put.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadPut {\n public constructor(readonly directory: string, readonly fileName: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadPut, register: Register): Promise {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.presignedPut(command.directory, command.fileName)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadPut(input: {\n storageName: "clientst",\n directory: "client1",\n fileName: "myfile.txt"\n }\n )\n}\n')),(0,r.kt)("p",null,"Azure Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadPut": "https://clientst.blob.core.windows.net/rocketfiles/client1/myfile.txt?"\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Presigned Get"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"presignedGet")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory and filename you want to get on the storage."),(0,r.kt)("p",null,"The storageName parameter is optional. 
It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-get.ts"',title:'"src/commands/file-upload-get.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadGet {\n public constructor(readonly directory: string, readonly fileName: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadGet, register: Register): Promise {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.presignedGet(command.directory, command.fileName)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadGet(input: {\n storageName: "clientst",\n directory: "client1",\n fileName: "myfile.txt"\n }\n )\n}\n')),(0,r.kt)("p",null,"Azure Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadGet": "https://clientst.blob.core.windows.net/rocketfiles/folder01%2Fmyfile.txt?"\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"List"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"list")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory you want to get the info and return the formatted results."),(0,r.kt)("p",null,"The storageName parameter is optional. It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-list.ts"',title:'"src/commands/file-upload-list.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\nimport { ListItem } from '@boostercloud/rocket-file-uploads-types'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadList {\n public constructor(readonly directory: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadList, register: Register): Promise> {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.list(command.directory)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadList(input: {\n storageName: "clientst",\n directory: "client1"\n }\n )\n}\n')),(0,r.kt)("p",null,"Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadList": [\n {\n "name": "client1/myfile.txt",\n "properties": {\n "createdOn": "2022-10-26T05:40:47.000Z",\n "lastModified": "2022-10-26T05:40:47.000Z",\n "contentLength": 6,\n "contentType": "text/plain"\n }\n }\n ]\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Delete File"),"Currently, the option to delete a file is only available on AWS. If this is a feature you were looking for, please let us know on Discord. 
Alternatively, you can implement this feature and submit a pull request on GitHub for this Rocket!"),(0,r.kt)("h2",{id:"azure-roles"},"Azure Roles"),(0,r.kt)("admonition",{type:"info"},(0,r.kt)("p",{parentName:"admonition"},"Starting at version ",(0,r.kt)("strong",{parentName:"p"},"0.31.0")," this Rocket use Managed Identities instead of Connection Strings. Please, check that you have the required permissions to assign roles ",(0,r.kt)("a",{parentName:"p",href:"https://learn.microsoft.com/en-us/azure/role-based-access-control/role-assignments-portal-managed-identity#prerequisites"},"https://learn.microsoft.com/en-us/azure/role-based-access-control/role-assignments-portal-managed-identity#prerequisites"))),(0,r.kt)("p",null,"For uploading files to Azure you need the Storage Blob Data Contributor role. This can be assigned to a user using the portal or with the next scripts:"),(0,r.kt)("p",null,"First, check if you have the correct permissions:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-bash"},'ACCOUNT_NAME=""\nCONTAINER_NAME=""\n\n# use this to test if you have the correct permissions\naz storage blob exists --account-name $ACCOUNT_NAME `\n --container-name $CONTAINER_NAME `\n --name blob1.txt --auth-mode login\n')),(0,r.kt)("p",null,"If you don't have it, then run this script as admin:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-bash"},'ACCOUNT_NAME=""\nCONTAINER_NAME=""\n\nOBJECT_ID=$(az ad user list --query "[?mailNickname==\'\'].objectId" -o tsv)\nSTORAGE_ID=$(az storage account show -n $ACCOUNT_NAME --query id -o tsv)\n\naz role assignment create \\\n --role "Storage Blob Data Contributor" \\\n --assignee $OBJECT_ID \\\n --scope "$STORAGE_ID/blobServices/default/containers/$CONTAINER_NAME"\n'))),(0,r.kt)(a.Z,{value:"aws-provider",label:"AWS Provider",default:!0,mdxType:"TabItem"},(0,r.kt)("p",null,"Install needed dependency packages:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-bash"},"npm install --save @boostercloud/rocket-file-uploads-core @boostercloud/rocket-file-uploads-types\nnpm install --save @boostercloud/rocket-file-uploads-aws\n")),(0,r.kt)("p",null,"Also, you will need a devDependency in your project:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-bash"},"npm install --save-dev @boostercloud/rocket-file-uploads-aws-infrastructure\n")),(0,r.kt)("p",null,"In your Booster config file, configure your BoosterRocketFiles:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/config/config.ts"',title:'"src/config/config.ts"'},"import { Booster } from '@boostercloud/framework-core'\nimport { BoosterConfig } from '@boostercloud/framework-types'\nimport { BoosterRocketFiles } from '@boostercloud/rocket-file-uploads-core'\nimport { RocketFilesUserConfiguration } from '@boostercloud/rocket-file-uploads-types'\n\nconst rocketFilesConfigurationDefault: RocketFilesUserConfiguration = {\n storageName: 'STORAGE_NAME',\n containerName: '', // Not used in AWS, you can just pass an empty string\n directories: ['DIRECTORY_1', 'DIRECTORY_2'],\n}\n\nconst rocketFilesConfigurationCms: RocketFilesUserConfiguration = {\n storageName: 'cmsst',\n containerName: '', // Not used in AWS, you can just pass an empty string\n directories: ['cms1', 'cms2'],\n}\n\nBooster.configure('production', (config: BoosterConfig): void => {\n config.appName = 'TEST_APP_NAME'\n config.providerPackage = 
'@boostercloud/framework-provider-aws'\n config.rockets = [\n new BoosterRocketFiles(config, [rocketFilesConfigurationDefault, rocketFilesConfigurationCms]).rocketForAWS(),\n ]\n})\n")),(0,r.kt)("admonition",{type:"info"},(0,r.kt)("p",{parentName:"admonition"},"Available parameters are:"),(0,r.kt)("ul",{parentName:"admonition"},(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"storageName"),": Name of the storage repository."),(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"directories"),": A list of folders where the files will be stored.")),(0,r.kt)("hr",{parentName:"admonition"}),(0,r.kt)("p",{parentName:"admonition"},"The structure created will be:"),(0,r.kt)("pre",{parentName:"admonition"},(0,r.kt)("code",{parentName:"pre",className:"language-text"},"\u251c\u2500\u2500 storageName\n\u2502 \u251c\u2500\u2500 directory\n"))),(0,r.kt)("h2",{id:"rocket-methods-usage-1"},"Rocket Methods Usage"),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Presigned Put"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"presignedPut")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory and filename you want to upload on the storage."),(0,r.kt)("p",null,"The storageName parameter is optional. It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-put.ts"',title:'"src/commands/file-upload-put.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadPut {\n public constructor(readonly directory: string, readonly fileName: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadPut, register: Register): Promise {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.presignedPut(command.directory, command.fileName) as Promise\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadPut(input: { \n directory: "files", \n fileName: "lol.jpg"\n }) {\n url\n fields\n }\n}\n')),(0,r.kt)("p",null,"AWS Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadPut": {\n "url": "https://s3.eu-west-1.amazonaws.com/myappstorage",\n "fields": {\n "Key": "files/lol.jpg",\n "bucket": "myappstorage",\n "X-Amz-Algorithm": "AWS4-HMAC-SHA256",\n "X-Amz-Credential": "blablabla.../eu-west-1/s3/aws4_request",\n "X-Amz-Date": "20230207T142138Z",\n "X-Amz-Security-Token": "IQoJb3JpZ2... blablabla",\n "Policy": "eyJleHBpcmF0a... blablabla",\n "X-Amz-Signature": "60511... blablabla"\n }\n }\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Presigned Get"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"presignedGet")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory and filename you want to get on the storage."),(0,r.kt)("p",null,"The storageName parameter is optional. 
It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-get.ts"',title:'"src/commands/file-upload-get.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadGet {\n public constructor(readonly directory: string, readonly fileName: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadGet, register: Register): Promise {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.presignedGet(command.directory, command.fileName)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadGet(input: {\n storageName: "clientst",\n directory: "client1",\n fileName: "myfile.txt"\n }\n )\n}\n')),(0,r.kt)("p",null,"AWS Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadGet": "https://myappstorage.s3.eu-west-1.amazonaws.com/client1/myfile.txt?"\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"List"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"list")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory you want to get the info and return the formatted results."),(0,r.kt)("p",null,"The storageName parameter is optional. It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-list.ts"',title:'"src/commands/file-upload-list.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\nimport { ListItem } from '@boostercloud/rocket-file-uploads-types'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadList {\n public constructor(readonly directory: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadList, register: Register): Promise> {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.list(command.directory)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadList(input: {\n storageName: "clientst",\n directory: "client1"\n }\n )\n}\n')),(0,r.kt)("p",null,"Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadList": [\n {\n "name": "client1/myfile.txt",\n "properties": {\n "createdOn": "2022-10-26T05:40:47.000Z",\n "lastModified": "2022-10-26T05:40:47.000Z",\n "contentLength": 6,\n "contentType": "text/plain"\n }\n }\n ]\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Delete File"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"deleteFile")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory and 
file name you want to delete."),(0,r.kt)("p",null,"The storageName parameter is optional. It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/delete-file.ts"',title:'"src/commands/delete-file.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\nimport { ListItem } from '@boostercloud/rocket-file-uploads-types'\n\n@Command({\n authorize: 'all',\n})\nexport class DeleteFile {\n public constructor(readonly directory: string, readonly fileName: string, readonly storageName?: string) {}\n\n public static async handle(command: DeleteFile, register: Register): Promise {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.deleteFile(command.directory, command.fileName)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n DeleteFile(input: {\n storageName: "clientst",\n directory: "client1",\n fileName: "myfile.txt"\n }\n )\n}\n')),(0,r.kt)("p",null,"Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "DeleteFile": true\n }\n}\n')))),(0,r.kt)(a.Z,{value:"local-provider",label:"Local Provider",default:!0,mdxType:"TabItem"},(0,r.kt)("p",null,"Install needed dependency packages:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-bash"},"npm install --save @boostercloud/rocket-file-uploads-core @boostercloud/rocket-file-uploads-types\nnpm install --save @boostercloud/rocket-file-uploads-local\n")),(0,r.kt)("p",null,"Also, you will need a devDependency in your project:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre"},"npm install --save-dev @boostercloud/rocket-file-uploads-local-infrastructure\n")),(0,r.kt)("p",null,"In your Booster config file, configure your BoosterRocketFiles:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/config/config.ts"',title:'"src/config/config.ts"'},"import { Booster } from '@boostercloud/framework-core'\nimport { BoosterConfig } from '@boostercloud/framework-types'\nimport { BoosterRocketFiles } from '@boostercloud/rocket-file-uploads-core'\nimport { RocketFilesUserConfiguration } from '@boostercloud/rocket-file-uploads-types'\n\nconst rocketFilesConfigurationDefault: RocketFilesUserConfiguration = {\n storageName: 'STORAGE_NAME',\n containerName: 'CONTAINER_NAME',\n directories: ['DIRECTORY_1', 'DIRECTORY_2'],\n}\n\nconst rocketFilesConfigurationCms: RocketFilesUserConfiguration = {\n storageName: 'cmsst',\n containerName: 'rocketfiles',\n directories: ['cms1', 'cms2'],\n}\n\nBooster.configure('local', (config: BoosterConfig): void => {\n config.appName = 'TEST_APP_NAME'\n config.providerPackage = '@boostercloud/framework-provider-local'\n config.rockets = [\n new BoosterRocketFiles(config, [rocketFilesConfigurationDefault, rocketFilesConfigurationCms]).rocketForLocal(),\n ]\n})\n")),(0,r.kt)("admonition",{type:"info"},(0,r.kt)("p",{parentName:"admonition"},"Available parameters are:"),(0,r.kt)("ul",{parentName:"admonition"},(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"storageName"),": Name of the storage 
repository."),(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"containerName"),": Directories container."),(0,r.kt)("li",{parentName:"ul"},(0,r.kt)("strong",{parentName:"li"},"directories"),": A list of folders where the files will be stored.")),(0,r.kt)("hr",{parentName:"admonition"}),(0,r.kt)("p",{parentName:"admonition"},"The structure created will be:"),(0,r.kt)("pre",{parentName:"admonition"},(0,r.kt)("code",{parentName:"pre",className:"language-text"},"\u251c\u2500\u2500 storageName\n\u2502 \u251c\u2500\u2500 containerName\n\u2502 \u2502 \u251c\u2500\u2500 directory\n")),(0,r.kt)("p",{parentName:"admonition"},(0,r.kt)("strong",{parentName:"p"},"NOTE:")," Local Provider will use ",(0,r.kt)("inlineCode",{parentName:"p"},"storageName")," as the root folder name.")),(0,r.kt)("h2",{id:"rocket-methods-usage-2"},"Rocket Methods Usage"),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Presigned Put"),"Create a command in your application and call the `presignedPut` method on the `FileHandler` class with the directory and filename you want to upload on the storage.",(0,r.kt)("p",null,"The storageName parameter is optional. It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-put.ts"',title:'"src/commands/file-upload-put.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadPut {\n public constructor(readonly directory: string, readonly fileName: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadPut, register: Register): Promise {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.presignedPut(command.directory, command.fileName)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadPut(input: {\n storageName: "clientst",\n directory: "client1",\n fileName: "myfile.txt"\n }\n )\n}\n')),(0,r.kt)("p",null,"Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadPut": "http://localhost:3000/clientst/rocketfiles/client1/myfile.txt"\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Presigned Get"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"presignedGet")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory and filename you want to get on the storage."),(0,r.kt)("p",null,"The storageName parameter is optional. 
It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-get.ts"',title:'"src/commands/file-upload-get.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadGet {\n public constructor(readonly directory: string, readonly fileName: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadGet, register: Register): Promise {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.presignedGet(command.directory, command.fileName)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadGet(input: {\n storageName: "clientst",\n directory: "client1",\n fileName: "myfile.txt"\n }\n )\n}\n')),(0,r.kt)("p",null,"Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadGet": "http://localhost:3000/clientst/rocketfiles/client1/myfile.txt"\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"List"),(0,r.kt)("p",null,"Create a command in your application and call the ",(0,r.kt)("inlineCode",{parentName:"p"},"list")," method on the ",(0,r.kt)("inlineCode",{parentName:"p"},"FileHandler")," class with the directory you want to get the info and return the formatted results."),(0,r.kt)("p",null,"The storageName parameter is optional. It will use the first storage if undefined."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/file-upload-list.ts"',title:'"src/commands/file-upload-list.ts"'},"import { Booster, Command } from '@boostercloud/framework-core'\nimport { Register } from '@boostercloud/framework-types'\nimport { FileHandler } from '@boostercloud/rocket-file-uploads-core'\nimport { ListItem } from '@boostercloud/rocket-file-uploads-types'\n\n@Command({\n authorize: 'all',\n})\nexport class FileUploadList {\n public constructor(readonly directory: string, readonly storageName?: string) {}\n\n public static async handle(command: FileUploadList, register: Register): Promise> {\n const boosterConfig = Booster.config\n const fileHandler = new FileHandler(boosterConfig, command.storageName)\n return await fileHandler.list(command.directory)\n }\n}\n")),(0,r.kt)("p",null,"GraphQL Mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'mutation {\n FileUploadList(input: {\n storageName: "clientst",\n directory: "client1"\n }\n )\n}\n')),(0,r.kt)("p",null,"Response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "FileUploadList": [\n {\n "name": "client1/myfile.txt",\n "properties": {\n "lastModified": "2022-10-26T10:35:18.905Z"\n }\n }\n ]\n }\n}\n'))),(0,r.kt)("details",null,(0,r.kt)("summary",null,"Delete File"),"Currently, the option to delete a file is only available on AWS. If this is a feature you were looking for, please let us know on Discord. 
Alternatively, you can implement this feature and submit a pull request on GitHub for this Rocket!"),(0,r.kt)("h2",{id:"security"},"Security"),(0,r.kt)("p",null,"Local Provider doesn't check paths. You should check that the directory and files passed as paratemers are valid."))),(0,r.kt)("hr",null),(0,r.kt)("h2",{id:"events"},"Events"),(0,r.kt)("p",null,"For each uploaded file a new event will be automatically generated and properly reduced on the entity ",(0,r.kt)("inlineCode",{parentName:"p"},"UploadedFileEntity"),"."),(0,r.kt)(l.Z,{groupId:"providers-usage",mdxType:"Tabs"},(0,r.kt)(a.Z,{value:"azure-and-aws-provider",label:"Azure & AWS Provider",default:!0,mdxType:"TabItem"},(0,r.kt)("p",null,"The event will look like this:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript"},'{\n "version": 1,\n "kind": "snapshot",\n "superKind": "domain",\n "requestID": "xxx",\n "entityID": "xxxx",\n "entityTypeName": "UploadedFileEntity",\n "typeName": "UploadedFileEntity",\n "value": {\n "id": "xxx",\n "metadata": {\n // A bunch of fields (depending on Azure or AWS)\n }\n },\n "createdAt": "2022-10-26T10:23:36.562Z",\n "snapshottedEventCreatedAt": "2022-10-26T10:23:32.34Z",\n "entityTypeName_entityID_kind": "UploadedFileEntity-xxx-b842-x-8975-xx-snapshot",\n "id": "x-x-x-x-x",\n "_rid": "x==",\n "_self": "dbs/x==/colls/x=/docs/x==/",\n "_etag": "\\"x-x-0500-0000-x\\"",\n "_attachments": "attachments/",\n "_ts": 123456\n}\n'))),(0,r.kt)(a.Z,{value:"local-provider",label:"Local Provider",default:!0,mdxType:"TabItem"},(0,r.kt)("p",null,"The event will look like this:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript"},'{\n "version": 1,\n "kind": "snapshot",\n "superKind": "domain",\n "requestID": "x",\n "entityID": "x",\n "entityTypeName": "UploadedFileEntity",\n "typeName": "UploadedFileEntity",\n "value": {\n "id": "x",\n "metadata": {\n "uri": "http://localhost:3000/clientst/rocketfiles/client1/myfile.txt",\n "name": "client1/myfile.txt"\n }\n },\n "createdAt": "2022-10-26T10:35:18.967Z",\n "snapshottedEventCreatedAt": "2022-10-26T10:35:18.958Z",\n "_id": "lMolccTNJVojXiLz"\n}\n')))),(0,r.kt)("h2",{id:"todos"},"TODOs"),(0,r.kt)("ul",null,(0,r.kt)("li",{parentName:"ul"},"Add file deletion to Azure and Local (only supported in AWS at the moment)."),(0,r.kt)("li",{parentName:"ul"},"Optional storage deletion when unmounting the stack."),(0,r.kt)("li",{parentName:"ul"},"Optional events, in case you don't want to store that information in the events-store."),(0,r.kt)("li",{parentName:"ul"},"When deleting a file, save a deletion event in the events-store. 
Only uploads are stored at the moment.")))}m.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/1efc9436.c0c03859.js b/assets/js/1efc9436.c0c03859.js deleted file mode 100644 index 4ef362713..000000000 --- a/assets/js/1efc9436.c0c03859.js +++ /dev/null @@ -1 +0,0 @@ -"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[2126],{3905:(e,t,n)=>{n.d(t,{Zo:()=>d,kt:()=>m});var a=n(7294);function i(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function r(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);t&&(a=a.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,a)}return n}function o(e){for(var t=1;t=0||(i[n]=e[n]);return i}(e,t);if(Object.getOwnPropertySymbols){var r=Object.getOwnPropertySymbols(e);for(a=0;a=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(i[n]=e[n])}return i}var s=a.createContext({}),c=function(e){var t=a.useContext(s),n=t;return e&&(n="function"==typeof e?e(t):o(o({},t),e)),n},d=function(e){var t=c(e.components);return a.createElement(s.Provider,{value:t},e.children)},u={inlineCode:"code",wrapper:function(e){var t=e.children;return a.createElement(a.Fragment,{},t)}},p=a.forwardRef((function(e,t){var n=e.components,i=e.mdxType,r=e.originalType,s=e.parentName,d=l(e,["components","mdxType","originalType","parentName"]),p=c(n),m=i,h=p["".concat(s,".").concat(m)]||p[m]||u[m]||r;return n?a.createElement(h,o(o({ref:t},d),{},{components:n})):a.createElement(h,o({ref:t},d))}));function m(e,t){var n=arguments,i=t&&t.mdxType;if("string"==typeof e||i){var r=n.length,o=new Array(r);o[0]=p;var l={};for(var s in t)hasOwnProperty.call(t,s)&&(l[s]=t[s]);l.originalType=e,l.mdxType="string"==typeof e?e:i,o[1]=l;for(var c=2;c{n.d(t,{Z:()=>c});var a=n(7294);const i="terminalWindow_wGrl",r="terminalWindowHeader_o9Cs",o="buttons_IGLB",l="dot_fGZE",s="terminalWindowBody_tzdS";function c(e){let{children:t}=e;return a.createElement("div",{className:i},a.createElement("div",{className:r},a.createElement("div",{className:o},a.createElement("span",{className:l,style:{background:"#f25f58"}}),a.createElement("span",{className:l,style:{background:"#fbbe3c"}}),a.createElement("span",{className:l,style:{background:"#58cb42"}}))),a.createElement("div",{className:s},t))}},9139:(e,t,n)=>{n.r(t),n.d(t,{assets:()=>c,contentTitle:()=>l,default:()=>p,frontMatter:()=>o,metadata:()=>s,toc:()=>d});var a=n(7462),i=(n(7294),n(3905)),r=n(5163);const o={},l="Entity",s={unversionedId:"architecture/entity",id:"architecture/entity",title:"Entity",description:"If events are the source of truth of your application, entities are the current state of your application. For example, if you have an application that allows users to create bank accounts, the events would be something like AccountCreated, MoneyDeposited, MoneyWithdrawn, etc. 
But the entities would be the BankAccount themselves, with the current balance, owner, etc.",source:"@site/docs/03_architecture/05_entity.mdx",sourceDirName:"03_architecture",slug:"/architecture/entity",permalink:"/architecture/entity",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/03_architecture/05_entity.mdx",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1696768385,formattedLastUpdatedAt:"Oct 8, 2023",sidebarPosition:5,frontMatter:{},sidebar:"docs",previous:{title:"Event handler",permalink:"/architecture/event-handler"},next:{title:"Read model",permalink:"/architecture/read-model"}},c={},d=[{value:"Creating entities",id:"creating-entities",level:2},{value:"Declaring an entity",id:"declaring-an-entity",level:2},{value:"The reduce function",id:"the-reduce-function",level:2},{value:"Reducing multiple events",id:"reducing-multiple-events",level:3},{value:"Eventual Consistency",id:"eventual-consistency",level:3},{value:"Entity ID",id:"entity-id",level:2},{value:"Entities naming convention",id:"entities-naming-convention",level:2}],u={toc:d};function p(e){let{components:t,...o}=e;return(0,i.kt)("wrapper",(0,a.Z)({},u,o,{components:t,mdxType:"MDXLayout"}),(0,i.kt)("h1",{id:"entity"},"Entity"),(0,i.kt)("p",null,"If events are the ",(0,i.kt)("em",{parentName:"p"},"source of truth")," of your application, entities are the ",(0,i.kt)("em",{parentName:"p"},"current state")," of your application. For example, if you have an application that allows users to create bank accounts, the events would be something like ",(0,i.kt)("inlineCode",{parentName:"p"},"AccountCreated"),", ",(0,i.kt)("inlineCode",{parentName:"p"},"MoneyDeposited"),", ",(0,i.kt)("inlineCode",{parentName:"p"},"MoneyWithdrawn"),", etc. But the entities would be the ",(0,i.kt)("inlineCode",{parentName:"p"},"BankAccount")," themselves, with the current balance, owner, etc."),(0,i.kt)("p",null,"Entities are created by ",(0,i.kt)("em",{parentName:"p"},"reducing")," the whole event stream. Booster generates entities on the fly, so you don't have to worry about their creation. However, you must define them in order to instruct Booster how to generate them."),(0,i.kt)("admonition",{type:"info"},(0,i.kt)("p",{parentName:"admonition"},"Under the hood, Booster stores snapshots of the entities in order to reduce the load on the event store. That way, Booster doesn't have to reduce the whole event stream whenever the current state of an entity is needed.")),(0,i.kt)("h2",{id:"creating-entities"},"Creating entities"),(0,i.kt)("p",null,"The Booster CLI will help you to create new entities. You just need to run the following command and the CLI will generate all the boilerplate for you:"),(0,i.kt)(r.Z,{mdxType:"TerminalWindow"},(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-shell"},"boost new:entity Product --fields displayName:string description:string price:Money\n"))),(0,i.kt)("p",null,"This will generate a new file called ",(0,i.kt)("inlineCode",{parentName:"p"},"product.ts")," in the ",(0,i.kt)("inlineCode",{parentName:"p"},"src/entities")," directory. You can also create the file manually, but you will need to create the class and decorate it, so we recommend using the CLI."),(0,i.kt)("h2",{id:"declaring-an-entity"},"Declaring an entity"),(0,i.kt)("p",null,"To declare an entity in Booster, you must define a class decorated with the ",(0,i.kt)("inlineCode",{parentName:"p"},"@Entity")," decorator. 
Inside of the class, you must define a constructor with all the fields you want to have in your entity."),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/entities/entity-name.ts"',title:'"src/entities/entity-name.ts"'},"@Entity\nexport class EntityName {\n public constructor(readonly fieldA: SomeType, readonly fieldB: SomeOtherType /* as many fields as needed */) {}\n}\n")),(0,i.kt)("h2",{id:"the-reduce-function"},"The reduce function"),(0,i.kt)("p",null,"In order to tell Booster how to reduce the events, you must define a static method decorated with the ",(0,i.kt)("inlineCode",{parentName:"p"},"@Reduces")," decorator. This method will be called by the framework every time an event of the specified type is emitted. The reducer method must return a new entity instance with the current state of the entity."),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/entities/entity-name.ts"',title:'"src/entities/entity-name.ts"'},"@Entity\nexport class EntityName {\n public constructor(readonly fieldA: SomeType, readonly fieldB: SomeOtherType /* as many fields as needed */) {}\n\n // highlight-start\n @Reduces(SomeEvent)\n public static reduceSomeEvent(event: SomeEvent, currentEntityState?: EntityName): EntityName {\n /* Return a new entity based on the current one */\n }\n // highlight-end\n}\n")),(0,i.kt)("p",null,"The reducer method receives two parameters:"),(0,i.kt)("ul",null,(0,i.kt)("li",{parentName:"ul"},(0,i.kt)("inlineCode",{parentName:"li"},"event")," - The event object that triggered the reducer"),(0,i.kt)("li",{parentName:"ul"},(0,i.kt)("inlineCode",{parentName:"li"},"currentEntity?")," - The current state of the entity instance that the event belongs to if it exists. ",(0,i.kt)("strong",{parentName:"li"},"This parameter is optional")," and will be ",(0,i.kt)("inlineCode",{parentName:"li"},"undefined")," if the entity doesn't exist yet (For example, when you process a ",(0,i.kt)("inlineCode",{parentName:"li"},"ProductCreated")," event that will generate the first version of a ",(0,i.kt)("inlineCode",{parentName:"li"},"Product")," entity).")),(0,i.kt)("h3",{id:"reducing-multiple-events"},"Reducing multiple events"),(0,i.kt)("p",null,"You can define as many reducer methods as you want, each one for a different event type. 
For example, if you have a ",(0,i.kt)("inlineCode",{parentName:"p"},"Cart")," entity, you could define a reducer for ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductAdded")," events and another one for ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductRemoved")," events."),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/entities/cart.ts"',title:'"src/entities/cart.ts"'},"@Entity\nexport class Cart {\n public constructor(readonly items: Array) {}\n\n @Reduces(ProductAdded)\n public static reduceProductAdded(event: ProductAdded, currentCart?: Cart): Cart {\n const newItems = addToCart(event.item, currentCart)\n return new Cart(newItems)\n }\n\n @Reduces(ProductRemoved)\n public static reduceProductRemoved(event: ProductRemoved, currentCart?: Cart): Cart {\n const newItems = removeFromCart(event.item, currentCart)\n return new Cart(newItems)\n }\n}\n")),(0,i.kt)("admonition",{type:"tip"},(0,i.kt)("p",{parentName:"admonition"},"It's highly recommended to ",(0,i.kt)("strong",{parentName:"p"},"keep your reducer functions pure"),", which means that you should be able to produce the new entity version by just looking at the event and the current entity state. You should avoid calling third party services, reading or writing to a database, or changing any external state.")),(0,i.kt)("p",null,"There could be a lot of events being reduced concurrently among many entities, but, ",(0,i.kt)("strong",{parentName:"p"},"for a specific entity instance, the events order is preserved"),". This means that while one event is being reduced, all other events of any kind ",(0,i.kt)("em",{parentName:"p"},"that belong to the same entity instance")," will be waiting in a queue until the previous reducer has finished. This is how Booster guarantees that the entity state is consistent."),(0,i.kt)("p",null,(0,i.kt)("img",{alt:"reducer process gif",src:n(5876).Z,width:"1208",height:"638"})),(0,i.kt)("h3",{id:"eventual-consistency"},"Eventual Consistency"),(0,i.kt)("p",null,"Additionally, due to the event driven and async nature of Booster, your data might not be instantly updated. Booster will consume the commands, generate events, and ",(0,i.kt)("em",{parentName:"p"},"eventually")," generate the entities. Most of the time this is not perceivable, but under huge loads, it could be noticed."),(0,i.kt)("p",null,"This property is called ",(0,i.kt)("a",{parentName:"p",href:"https://en.wikipedia.org/wiki/Eventual_consistency"},"Eventual Consistency"),", and it is a trade-off to have high availability for extreme situations, where other systems might simply fail."),(0,i.kt)("h2",{id:"entity-id"},"Entity ID"),(0,i.kt)("p",null,"In order to identify each entity instance, you must define an ",(0,i.kt)("inlineCode",{parentName:"p"},"id")," field on each entity. This field will be used by the framework to identify the entity instance. 
If the value of the ",(0,i.kt)("inlineCode",{parentName:"p"},"id")," field matches the value returned by the ",(0,i.kt)("a",{parentName:"p",href:"event#events-and-entities"},(0,i.kt)("inlineCode",{parentName:"a"},"entityID()")," method")," of an Event, the framework will consider that the event belongs to that entity instance."),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/entities/entity-name.ts"',title:'"src/entities/entity-name.ts"'},"@Entity\nexport class EntityName {\n public constructor(\n // highlight-next-line\n readonly id: UUID,\n readonly fieldA: SomeType,\n readonly fieldB: SomeOtherType /* as many fields as needed */\n ) {}\n\n @Reduces(SomeEvent)\n public static reduceSomeEvent(event: SomeEvent, currentEntityState?: EntityName): EntityName {\n /* Return a new entity based on the current one */\n }\n}\n")),(0,i.kt)("admonition",{type:"tip"},(0,i.kt)("p",{parentName:"admonition"},"We recommend you to use the ",(0,i.kt)("inlineCode",{parentName:"p"},"UUID")," type for the ",(0,i.kt)("inlineCode",{parentName:"p"},"id")," field. You can generate a new ",(0,i.kt)("inlineCode",{parentName:"p"},"UUID")," value by calling the ",(0,i.kt)("inlineCode",{parentName:"p"},"UUID.generate()")," method already provided by the framework.")),(0,i.kt)("h2",{id:"entities-naming-convention"},"Entities naming convention"),(0,i.kt)("p",null,"Entities are a representation of your application state in a specific moment, so name them as closely to your domain objects as possible. Typical entity names are nouns that might appear when you think about your app. In an e-commerce application, some entities would be:"),(0,i.kt)("ul",null,(0,i.kt)("li",{parentName:"ul"},"Cart"),(0,i.kt)("li",{parentName:"ul"},"Product"),(0,i.kt)("li",{parentName:"ul"},"UserProfile"),(0,i.kt)("li",{parentName:"ul"},"Order"),(0,i.kt)("li",{parentName:"ul"},"Address"),(0,i.kt)("li",{parentName:"ul"},"PaymentMethod"),(0,i.kt)("li",{parentName:"ul"},"Stock")),(0,i.kt)("p",null,"Entities live within the entities directory of the project source: ",(0,i.kt)("inlineCode",{parentName:"p"},"/src/entities"),"."),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-text"},"\n\u251c\u2500\u2500 src\n\u2502 \u251c\u2500\u2500 commands\n\u2502 \u251c\u2500\u2500 common\n\u2502 \u251c\u2500\u2500 config\n\u2502 \u251c\u2500\u2500 entities <------ put them here\n\u2502 \u251c\u2500\u2500 events\n\u2502 \u251c\u2500\u2500 index.ts\n\u2502 \u2514\u2500\u2500 read-models\n")))}p.isMDXComponent=!0},5876:(e,t,n)=>{n.d(t,{Z:()=>a});const a=n.p+"assets/images/reducer-faf967cd976ea38d84e14551aa3af383.gif"}}]); \ No newline at end of file diff --git a/assets/js/1efc9436.f2e51a64.js b/assets/js/1efc9436.f2e51a64.js new file mode 100644 index 000000000..abb085718 --- /dev/null +++ b/assets/js/1efc9436.f2e51a64.js @@ -0,0 +1 @@ +"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[2126],{3905:(e,t,n)=>{n.d(t,{Zo:()=>d,kt:()=>m});var a=n(7294);function i(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function r(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);t&&(a=a.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,a)}return n}function o(e){for(var t=1;t=0||(i[n]=e[n]);return i}(e,t);if(Object.getOwnPropertySymbols){var 
r=Object.getOwnPropertySymbols(e);for(a=0;a=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(i[n]=e[n])}return i}var s=a.createContext({}),c=function(e){var t=a.useContext(s),n=t;return e&&(n="function"==typeof e?e(t):o(o({},t),e)),n},d=function(e){var t=c(e.components);return a.createElement(s.Provider,{value:t},e.children)},u={inlineCode:"code",wrapper:function(e){var t=e.children;return a.createElement(a.Fragment,{},t)}},p=a.forwardRef((function(e,t){var n=e.components,i=e.mdxType,r=e.originalType,s=e.parentName,d=l(e,["components","mdxType","originalType","parentName"]),p=c(n),m=i,h=p["".concat(s,".").concat(m)]||p[m]||u[m]||r;return n?a.createElement(h,o(o({ref:t},d),{},{components:n})):a.createElement(h,o({ref:t},d))}));function m(e,t){var n=arguments,i=t&&t.mdxType;if("string"==typeof e||i){var r=n.length,o=new Array(r);o[0]=p;var l={};for(var s in t)hasOwnProperty.call(t,s)&&(l[s]=t[s]);l.originalType=e,l.mdxType="string"==typeof e?e:i,o[1]=l;for(var c=2;c{n.d(t,{Z:()=>c});var a=n(7294);const i="terminalWindow_wGrl",r="terminalWindowHeader_o9Cs",o="buttons_IGLB",l="dot_fGZE",s="terminalWindowBody_tzdS";function c(e){let{children:t}=e;return a.createElement("div",{className:i},a.createElement("div",{className:r},a.createElement("div",{className:o},a.createElement("span",{className:l,style:{background:"#f25f58"}}),a.createElement("span",{className:l,style:{background:"#fbbe3c"}}),a.createElement("span",{className:l,style:{background:"#58cb42"}}))),a.createElement("div",{className:s},t))}},9139:(e,t,n)=>{n.r(t),n.d(t,{assets:()=>c,contentTitle:()=>l,default:()=>p,frontMatter:()=>o,metadata:()=>s,toc:()=>d});var a=n(7462),i=(n(7294),n(3905)),r=n(5163);const o={},l="Entity",s={unversionedId:"architecture/entity",id:"architecture/entity",title:"Entity",description:"If events are the source of truth of your application, entities are the current state of your application. For example, if you have an application that allows users to create bank accounts, the events would be something like AccountCreated, MoneyDeposited, MoneyWithdrawn, etc. But the entities would be the BankAccount themselves, with the current balance, owner, etc.",source:"@site/docs/03_architecture/05_entity.mdx",sourceDirName:"03_architecture",slug:"/architecture/entity",permalink:"/architecture/entity",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/03_architecture/05_entity.mdx",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1697652123,formattedLastUpdatedAt:"Oct 18, 2023",sidebarPosition:5,frontMatter:{},sidebar:"docs",previous:{title:"Event handler",permalink:"/architecture/event-handler"},next:{title:"Read model",permalink:"/architecture/read-model"}},c={},d=[{value:"Creating entities",id:"creating-entities",level:2},{value:"Declaring an entity",id:"declaring-an-entity",level:2},{value:"The reduce function",id:"the-reduce-function",level:2},{value:"Reducing multiple events",id:"reducing-multiple-events",level:3},{value:"Eventual Consistency",id:"eventual-consistency",level:3},{value:"Entity ID",id:"entity-id",level:2},{value:"Entities naming convention",id:"entities-naming-convention",level:2}],u={toc:d};function p(e){let{components:t,...o}=e;return(0,i.kt)("wrapper",(0,a.Z)({},u,o,{components:t,mdxType:"MDXLayout"}),(0,i.kt)("h1",{id:"entity"},"Entity"),(0,i.kt)("p",null,"If events are the ",(0,i.kt)("em",{parentName:"p"},"source of truth")," of your application, entities are the ",(0,i.kt)("em",{parentName:"p"},"current state")," of your application. 
For example, if you have an application that allows users to create bank accounts, the events would be something like ",(0,i.kt)("inlineCode",{parentName:"p"},"AccountCreated"),", ",(0,i.kt)("inlineCode",{parentName:"p"},"MoneyDeposited"),", ",(0,i.kt)("inlineCode",{parentName:"p"},"MoneyWithdrawn"),", etc. But the entities would be the ",(0,i.kt)("inlineCode",{parentName:"p"},"BankAccount")," themselves, with the current balance, owner, etc."),(0,i.kt)("p",null,"Entities are created by ",(0,i.kt)("em",{parentName:"p"},"reducing")," the whole event stream. Booster generates entities on the fly, so you don't have to worry about their creation. However, you must define them in order to instruct Booster how to generate them."),(0,i.kt)("admonition",{type:"info"},(0,i.kt)("p",{parentName:"admonition"},"Under the hood, Booster stores snapshots of the entities in order to reduce the load on the event store. That way, Booster doesn't have to reduce the whole event stream whenever the current state of an entity is needed.")),(0,i.kt)("h2",{id:"creating-entities"},"Creating entities"),(0,i.kt)("p",null,"The Booster CLI will help you to create new entities. You just need to run the following command and the CLI will generate all the boilerplate for you:"),(0,i.kt)(r.Z,{mdxType:"TerminalWindow"},(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-shell"},"boost new:entity Product --fields displayName:string description:string price:Money\n"))),(0,i.kt)("p",null,"This will generate a new file called ",(0,i.kt)("inlineCode",{parentName:"p"},"product.ts")," in the ",(0,i.kt)("inlineCode",{parentName:"p"},"src/entities")," directory. You can also create the file manually, but you will need to create the class and decorate it, so we recommend using the CLI."),(0,i.kt)("h2",{id:"declaring-an-entity"},"Declaring an entity"),(0,i.kt)("p",null,"To declare an entity in Booster, you must define a class decorated with the ",(0,i.kt)("inlineCode",{parentName:"p"},"@Entity")," decorator. Inside of the class, you must define a constructor with all the fields you want to have in your entity."),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/entities/entity-name.ts"',title:'"src/entities/entity-name.ts"'},"@Entity\nexport class EntityName {\n public constructor(readonly fieldA: SomeType, readonly fieldB: SomeOtherType /* as many fields as needed */) {}\n}\n")),(0,i.kt)("h2",{id:"the-reduce-function"},"The reduce function"),(0,i.kt)("p",null,"In order to tell Booster how to reduce the events, you must define a static method decorated with the ",(0,i.kt)("inlineCode",{parentName:"p"},"@Reduces")," decorator. This method will be called by the framework every time an event of the specified type is emitted. 
The reducer method must return a new entity instance with the current state of the entity."),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/entities/entity-name.ts"',title:'"src/entities/entity-name.ts"'},"@Entity\nexport class EntityName {\n public constructor(readonly fieldA: SomeType, readonly fieldB: SomeOtherType /* as many fields as needed */) {}\n\n // highlight-start\n @Reduces(SomeEvent)\n public static reduceSomeEvent(event: SomeEvent, currentEntityState?: EntityName): EntityName {\n /* Return a new entity based on the current one */\n }\n // highlight-end\n}\n")),(0,i.kt)("p",null,"The reducer method receives two parameters:"),(0,i.kt)("ul",null,(0,i.kt)("li",{parentName:"ul"},(0,i.kt)("inlineCode",{parentName:"li"},"event")," - The event object that triggered the reducer"),(0,i.kt)("li",{parentName:"ul"},(0,i.kt)("inlineCode",{parentName:"li"},"currentEntity?")," - The current state of the entity instance that the event belongs to if it exists. ",(0,i.kt)("strong",{parentName:"li"},"This parameter is optional")," and will be ",(0,i.kt)("inlineCode",{parentName:"li"},"undefined")," if the entity doesn't exist yet (For example, when you process a ",(0,i.kt)("inlineCode",{parentName:"li"},"ProductCreated")," event that will generate the first version of a ",(0,i.kt)("inlineCode",{parentName:"li"},"Product")," entity).")),(0,i.kt)("h3",{id:"reducing-multiple-events"},"Reducing multiple events"),(0,i.kt)("p",null,"You can define as many reducer methods as you want, each one for a different event type. For example, if you have a ",(0,i.kt)("inlineCode",{parentName:"p"},"Cart")," entity, you could define a reducer for ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductAdded")," events and another one for ",(0,i.kt)("inlineCode",{parentName:"p"},"ProductRemoved")," events."),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/entities/cart.ts"',title:'"src/entities/cart.ts"'},"@Entity\nexport class Cart {\n public constructor(readonly items: Array) {}\n\n @Reduces(ProductAdded)\n public static reduceProductAdded(event: ProductAdded, currentCart?: Cart): Cart {\n const newItems = addToCart(event.item, currentCart)\n return new Cart(newItems)\n }\n\n @Reduces(ProductRemoved)\n public static reduceProductRemoved(event: ProductRemoved, currentCart?: Cart): Cart {\n const newItems = removeFromCart(event.item, currentCart)\n return new Cart(newItems)\n }\n}\n")),(0,i.kt)("admonition",{type:"tip"},(0,i.kt)("p",{parentName:"admonition"},"It's highly recommended to ",(0,i.kt)("strong",{parentName:"p"},"keep your reducer functions pure"),", which means that you should be able to produce the new entity version by just looking at the event and the current entity state. You should avoid calling third party services, reading or writing to a database, or changing any external state.")),(0,i.kt)("p",null,"There could be a lot of events being reduced concurrently among many entities, but, ",(0,i.kt)("strong",{parentName:"p"},"for a specific entity instance, the events order is preserved"),". This means that while one event is being reduced, all other events of any kind ",(0,i.kt)("em",{parentName:"p"},"that belong to the same entity instance")," will be waiting in a queue until the previous reducer has finished. 
This is how Booster guarantees that the entity state is consistent."),(0,i.kt)("p",null,(0,i.kt)("img",{alt:"reducer process gif",src:n(5876).Z,width:"1208",height:"638"})),(0,i.kt)("h3",{id:"eventual-consistency"},"Eventual Consistency"),(0,i.kt)("p",null,"Additionally, due to the event driven and async nature of Booster, your data might not be instantly updated. Booster will consume the commands, generate events, and ",(0,i.kt)("em",{parentName:"p"},"eventually")," generate the entities. Most of the time this is not perceivable, but under huge loads, it could be noticed."),(0,i.kt)("p",null,"This property is called ",(0,i.kt)("a",{parentName:"p",href:"https://en.wikipedia.org/wiki/Eventual_consistency"},"Eventual Consistency"),", and it is a trade-off to have high availability for extreme situations, where other systems might simply fail."),(0,i.kt)("h2",{id:"entity-id"},"Entity ID"),(0,i.kt)("p",null,"In order to identify each entity instance, you must define an ",(0,i.kt)("inlineCode",{parentName:"p"},"id")," field on each entity. This field will be used by the framework to identify the entity instance. If the value of the ",(0,i.kt)("inlineCode",{parentName:"p"},"id")," field matches the value returned by the ",(0,i.kt)("a",{parentName:"p",href:"event#events-and-entities"},(0,i.kt)("inlineCode",{parentName:"a"},"entityID()")," method")," of an Event, the framework will consider that the event belongs to that entity instance."),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/entities/entity-name.ts"',title:'"src/entities/entity-name.ts"'},"@Entity\nexport class EntityName {\n public constructor(\n // highlight-next-line\n readonly id: UUID,\n readonly fieldA: SomeType,\n readonly fieldB: SomeOtherType /* as many fields as needed */\n ) {}\n\n @Reduces(SomeEvent)\n public static reduceSomeEvent(event: SomeEvent, currentEntityState?: EntityName): EntityName {\n /* Return a new entity based on the current one */\n }\n}\n")),(0,i.kt)("admonition",{type:"tip"},(0,i.kt)("p",{parentName:"admonition"},"We recommend you to use the ",(0,i.kt)("inlineCode",{parentName:"p"},"UUID")," type for the ",(0,i.kt)("inlineCode",{parentName:"p"},"id")," field. You can generate a new ",(0,i.kt)("inlineCode",{parentName:"p"},"UUID")," value by calling the ",(0,i.kt)("inlineCode",{parentName:"p"},"UUID.generate()")," method already provided by the framework.")),(0,i.kt)("h2",{id:"entities-naming-convention"},"Entities naming convention"),(0,i.kt)("p",null,"Entities are a representation of your application state in a specific moment, so name them as closely to your domain objects as possible. Typical entity names are nouns that might appear when you think about your app. 
In an e-commerce application, some entities would be:"),(0,i.kt)("ul",null,(0,i.kt)("li",{parentName:"ul"},"Cart"),(0,i.kt)("li",{parentName:"ul"},"Product"),(0,i.kt)("li",{parentName:"ul"},"UserProfile"),(0,i.kt)("li",{parentName:"ul"},"Order"),(0,i.kt)("li",{parentName:"ul"},"Address"),(0,i.kt)("li",{parentName:"ul"},"PaymentMethod"),(0,i.kt)("li",{parentName:"ul"},"Stock")),(0,i.kt)("p",null,"Entities live within the entities directory of the project source: ",(0,i.kt)("inlineCode",{parentName:"p"},"/src/entities"),"."),(0,i.kt)("pre",null,(0,i.kt)("code",{parentName:"pre",className:"language-text"},"\n\u251c\u2500\u2500 src\n\u2502 \u251c\u2500\u2500 commands\n\u2502 \u251c\u2500\u2500 common\n\u2502 \u251c\u2500\u2500 config\n\u2502 \u251c\u2500\u2500 entities <------ put them here\n\u2502 \u251c\u2500\u2500 events\n\u2502 \u251c\u2500\u2500 index.ts\n\u2502 \u2514\u2500\u2500 read-models\n")))}p.isMDXComponent=!0},5876:(e,t,n)=>{n.d(t,{Z:()=>a});const a=n.p+"assets/images/reducer-faf967cd976ea38d84e14551aa3af383.gif"}}]); \ No newline at end of file diff --git a/assets/js/352d8b40.3128dcbb.js b/assets/js/352d8b40.3128dcbb.js deleted file mode 100644 index 3432b0233..000000000 --- a/assets/js/352d8b40.3128dcbb.js +++ /dev/null @@ -1 +0,0 @@ -"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[6502],{3905:(t,e,n)=>{n.d(e,{Zo:()=>p,kt:()=>u});var i=n(7294);function a(t,e,n){return e in t?Object.defineProperty(t,e,{value:n,enumerable:!0,configurable:!0,writable:!0}):t[e]=n,t}function o(t,e){var n=Object.keys(t);if(Object.getOwnPropertySymbols){var i=Object.getOwnPropertySymbols(t);e&&(i=i.filter((function(e){return Object.getOwnPropertyDescriptor(t,e).enumerable}))),n.push.apply(n,i)}return n}function r(t){for(var e=1;e=0||(a[n]=t[n]);return a}(t,e);if(Object.getOwnPropertySymbols){var o=Object.getOwnPropertySymbols(t);for(i=0;i=0||Object.prototype.propertyIsEnumerable.call(t,n)&&(a[n]=t[n])}return a}var s=i.createContext({}),l=function(t){var e=i.useContext(s),n=e;return t&&(n="function"==typeof t?t(e):r(r({},e),t)),n},p=function(t){var e=l(t.components);return i.createElement(s.Provider,{value:e},t.children)},d={inlineCode:"code",wrapper:function(t){var e=t.children;return i.createElement(i.Fragment,{},e)}},f=i.forwardRef((function(t,e){var n=t.components,a=t.mdxType,o=t.originalType,s=t.parentName,p=c(t,["components","mdxType","originalType","parentName"]),f=l(n),u=a,m=f["".concat(s,".").concat(u)]||f[u]||d[u]||o;return n?i.createElement(m,r(r({ref:e},p),{},{components:n})):i.createElement(m,r({ref:e},p))}));function u(t,e){var n=arguments,a=e&&e.mdxType;if("string"==typeof t||a){var o=n.length,r=new Array(o);r[0]=f;var c={};for(var s in e)hasOwnProperty.call(e,s)&&(c[s]=e[s]);c.originalType=t,c.mdxType="string"==typeof t?t:a,r[1]=c;for(var l=2;l{n.d(e,{Z:()=>l});var i=n(7294);const a="terminalWindow_wGrl",o="terminalWindowHeader_o9Cs",r="buttons_IGLB",c="dot_fGZE",s="terminalWindowBody_tzdS";function l(t){let{children:e}=t;return i.createElement("div",{className:a},i.createElement("div",{className:o},i.createElement("div",{className:r},i.createElement("span",{className:c,style:{background:"#f25f58"}}),i.createElement("span",{className:c,style:{background:"#fbbe3c"}}),i.createElement("span",{className:c,style:{background:"#58cb42"}}))),i.createElement("div",{className:s},e))}},9741:(t,e,n)=>{n.r(e),n.d(e,{assets:()=>l,contentTitle:()=>c,default:()=>f,frontMatter:()=>r,metadata:()=>s,toc:()=>p});var i=n(7462),a=(n(7294),n(3905)),o=n(5163);const 
r={description:"Documentation for defining notifications in the Booster Framework using the @Notification and @partitionKey decorators."},c="Notifications",s={unversionedId:"architecture/notifications",id:"architecture/notifications",title:"Notifications",description:"Documentation for defining notifications in the Booster Framework using the @Notification and @partitionKey decorators.",source:"@site/docs/03_architecture/07_notifications.mdx",sourceDirName:"03_architecture",slug:"/architecture/notifications",permalink:"/architecture/notifications",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/03_architecture/07_notifications.mdx",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1696768385,formattedLastUpdatedAt:"Oct 8, 2023",sidebarPosition:7,frontMatter:{description:"Documentation for defining notifications in the Booster Framework using the @Notification and @partitionKey decorators."},sidebar:"docs",previous:{title:"Read model",permalink:"/architecture/read-model"},next:{title:"Features",permalink:"/category/features"}},l={},p=[{value:"Declaring a notification",id:"declaring-a-notification",level:2},{value:"Separating by topic",id:"separating-by-topic",level:2},{value:"Separating by partition key",id:"separating-by-partition-key",level:2},{value:"Reacting to notifications",id:"reacting-to-notifications",level:2}],d={toc:p};function f(t){let{components:e,...n}=t;return(0,a.kt)("wrapper",(0,i.Z)({},d,n,{components:e,mdxType:"MDXLayout"}),(0,a.kt)("h1",{id:"notifications"},"Notifications"),(0,a.kt)("p",null,"Notifications are an important concept in event-driven architecture, and they play a crucial role in informing interested parties about certain events that take place within an application."),(0,a.kt)("h2",{id:"declaring-a-notification"},"Declaring a notification"),(0,a.kt)("p",null,"In Booster, notifications are defined as classes decorated with the ",(0,a.kt)("inlineCode",{parentName:"p"},"@Notification")," decorator. Here's a minimal example to illustrate this:"),(0,a.kt)(o.Z,{mdxType:"TerminalWindow"},(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/notifications/cart-abandoned.ts"',title:'"src/notifications/cart-abandoned.ts"'},"import { Notification } from '@boostercloud/framework-core'\n\n@Notification()\nexport class CartAbandoned {}\n"))),(0,a.kt)("p",null,"As you can see, to define a notification you simply need to import the ",(0,a.kt)("inlineCode",{parentName:"p"},"@Notification")," decorator from the @boostercloud/framework-core library and use it to decorate a class. In this case, the class ",(0,a.kt)("inlineCode",{parentName:"p"},"CartAbandoned")," represents a notification that informs interested parties that a cart has been abandoned."),(0,a.kt)("h2",{id:"separating-by-topic"},"Separating by topic"),(0,a.kt)("p",null,"By default, all notifications in the application will be sent to the same topic called ",(0,a.kt)("inlineCode",{parentName:"p"},"defaultTopic"),". 
To configure this, you can specify a different topic name in the ",(0,a.kt)("inlineCode",{parentName:"p"},"@Notification")," decorator:"),(0,a.kt)(o.Z,{mdxType:"TerminalWindow"},(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/notifications/cart-abandoned-topic.ts"',title:'"src/notifications/cart-abandoned-topic.ts"'},"import { Notification } from '@boostercloud/framework-core'\n\n@Notification({ topic: 'cart-abandoned' })\nexport class CartAbandoned {}\n"))),(0,a.kt)("p",null,"In this example, the ",(0,a.kt)("inlineCode",{parentName:"p"},"CartAbandoned")," notification will be sent to the ",(0,a.kt)("inlineCode",{parentName:"p"},"cart-abandoned")," topic, instead of the default topic."),(0,a.kt)("h2",{id:"separating-by-partition-key"},"Separating by partition key"),(0,a.kt)("p",null,"By default, all the notifications in the application will share a partition key called ",(0,a.kt)("inlineCode",{parentName:"p"},"default"),". This means that, by default, all the notifications in the application will be processed in order, which may not be as performant."),(0,a.kt)("p",null,"To change this, you can use the @partitionKey decorator to specify a field that will be used as a partition key for each notification:"),(0,a.kt)(o.Z,{mdxType:"TerminalWindow"},(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/notifications/cart-abandoned-partition-key.ts"',title:'"src/notifications/cart-abandoned-partition-key.ts"'},"import { Notification, partitionKey } from '@boostercloud/framework-core'\n\n@Notification({ topic: 'cart-abandoned' })\nexport class CartAbandoned {\n public constructor(@partitionKey readonly key: string) {}\n}\n"))),(0,a.kt)("p",null,"In this example, each ",(0,a.kt)("inlineCode",{parentName:"p"},"CartAbandoned")," notification will have its own partition key, which is specified in the constructor as the field ",(0,a.kt)("inlineCode",{parentName:"p"},"key"),", it can be called in any way you want. This will allow for parallel processing of notifications, making the system more performant."),(0,a.kt)("h2",{id:"reacting-to-notifications"},"Reacting to notifications"),(0,a.kt)("p",null,"Just like events, notifications can be handled by event handlers in order to trigger other processes. 
Event handlers are responsible for listening to events and notifications, and then performing specific actions in response to them."),(0,a.kt)("p",null,"In conclusion, defining notifications in the Booster Framework is a simple and straightforward process that can be done using the ",(0,a.kt)("inlineCode",{parentName:"p"},"@Notification")," and ",(0,a.kt)("inlineCode",{parentName:"p"},"@partitionKey")," decorators."))}f.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/352d8b40.a552d311.js b/assets/js/352d8b40.a552d311.js new file mode 100644 index 000000000..a2477c8d6 --- /dev/null +++ b/assets/js/352d8b40.a552d311.js @@ -0,0 +1 @@ +"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[6502],{3905:(t,e,n)=>{n.d(e,{Zo:()=>p,kt:()=>u});var i=n(7294);function a(t,e,n){return e in t?Object.defineProperty(t,e,{value:n,enumerable:!0,configurable:!0,writable:!0}):t[e]=n,t}function o(t,e){var n=Object.keys(t);if(Object.getOwnPropertySymbols){var i=Object.getOwnPropertySymbols(t);e&&(i=i.filter((function(e){return Object.getOwnPropertyDescriptor(t,e).enumerable}))),n.push.apply(n,i)}return n}function r(t){for(var e=1;e=0||(a[n]=t[n]);return a}(t,e);if(Object.getOwnPropertySymbols){var o=Object.getOwnPropertySymbols(t);for(i=0;i=0||Object.prototype.propertyIsEnumerable.call(t,n)&&(a[n]=t[n])}return a}var s=i.createContext({}),l=function(t){var e=i.useContext(s),n=e;return t&&(n="function"==typeof t?t(e):r(r({},e),t)),n},p=function(t){var e=l(t.components);return i.createElement(s.Provider,{value:e},t.children)},d={inlineCode:"code",wrapper:function(t){var e=t.children;return i.createElement(i.Fragment,{},e)}},f=i.forwardRef((function(t,e){var n=t.components,a=t.mdxType,o=t.originalType,s=t.parentName,p=c(t,["components","mdxType","originalType","parentName"]),f=l(n),u=a,m=f["".concat(s,".").concat(u)]||f[u]||d[u]||o;return n?i.createElement(m,r(r({ref:e},p),{},{components:n})):i.createElement(m,r({ref:e},p))}));function u(t,e){var n=arguments,a=e&&e.mdxType;if("string"==typeof t||a){var o=n.length,r=new Array(o);r[0]=f;var c={};for(var s in e)hasOwnProperty.call(e,s)&&(c[s]=e[s]);c.originalType=t,c.mdxType="string"==typeof t?t:a,r[1]=c;for(var l=2;l{n.d(e,{Z:()=>l});var i=n(7294);const a="terminalWindow_wGrl",o="terminalWindowHeader_o9Cs",r="buttons_IGLB",c="dot_fGZE",s="terminalWindowBody_tzdS";function l(t){let{children:e}=t;return i.createElement("div",{className:a},i.createElement("div",{className:o},i.createElement("div",{className:r},i.createElement("span",{className:c,style:{background:"#f25f58"}}),i.createElement("span",{className:c,style:{background:"#fbbe3c"}}),i.createElement("span",{className:c,style:{background:"#58cb42"}}))),i.createElement("div",{className:s},e))}},9741:(t,e,n)=>{n.r(e),n.d(e,{assets:()=>l,contentTitle:()=>c,default:()=>f,frontMatter:()=>r,metadata:()=>s,toc:()=>p});var i=n(7462),a=(n(7294),n(3905)),o=n(5163);const r={description:"Documentation for defining notifications in the Booster Framework using the @Notification and @partitionKey decorators."},c="Notifications",s={unversionedId:"architecture/notifications",id:"architecture/notifications",title:"Notifications",description:"Documentation for defining notifications in the Booster Framework using the @Notification and @partitionKey 
decorators.",source:"@site/docs/03_architecture/07_notifications.mdx",sourceDirName:"03_architecture",slug:"/architecture/notifications",permalink:"/architecture/notifications",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/03_architecture/07_notifications.mdx",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1697652123,formattedLastUpdatedAt:"Oct 18, 2023",sidebarPosition:7,frontMatter:{description:"Documentation for defining notifications in the Booster Framework using the @Notification and @partitionKey decorators."},sidebar:"docs",previous:{title:"Read model",permalink:"/architecture/read-model"},next:{title:"Features",permalink:"/category/features"}},l={},p=[{value:"Declaring a notification",id:"declaring-a-notification",level:2},{value:"Separating by topic",id:"separating-by-topic",level:2},{value:"Separating by partition key",id:"separating-by-partition-key",level:2},{value:"Reacting to notifications",id:"reacting-to-notifications",level:2}],d={toc:p};function f(t){let{components:e,...n}=t;return(0,a.kt)("wrapper",(0,i.Z)({},d,n,{components:e,mdxType:"MDXLayout"}),(0,a.kt)("h1",{id:"notifications"},"Notifications"),(0,a.kt)("p",null,"Notifications are an important concept in event-driven architecture, and they play a crucial role in informing interested parties about certain events that take place within an application."),(0,a.kt)("h2",{id:"declaring-a-notification"},"Declaring a notification"),(0,a.kt)("p",null,"In Booster, notifications are defined as classes decorated with the ",(0,a.kt)("inlineCode",{parentName:"p"},"@Notification")," decorator. Here's a minimal example to illustrate this:"),(0,a.kt)(o.Z,{mdxType:"TerminalWindow"},(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/notifications/cart-abandoned.ts"',title:'"src/notifications/cart-abandoned.ts"'},"import { Notification } from '@boostercloud/framework-core'\n\n@Notification()\nexport class CartAbandoned {}\n"))),(0,a.kt)("p",null,"As you can see, to define a notification you simply need to import the ",(0,a.kt)("inlineCode",{parentName:"p"},"@Notification")," decorator from the @boostercloud/framework-core library and use it to decorate a class. In this case, the class ",(0,a.kt)("inlineCode",{parentName:"p"},"CartAbandoned")," represents a notification that informs interested parties that a cart has been abandoned."),(0,a.kt)("h2",{id:"separating-by-topic"},"Separating by topic"),(0,a.kt)("p",null,"By default, all notifications in the application will be sent to the same topic called ",(0,a.kt)("inlineCode",{parentName:"p"},"defaultTopic"),". 
To configure this, you can specify a different topic name in the ",(0,a.kt)("inlineCode",{parentName:"p"},"@Notification")," decorator:"),(0,a.kt)(o.Z,{mdxType:"TerminalWindow"},(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/notifications/cart-abandoned-topic.ts"',title:'"src/notifications/cart-abandoned-topic.ts"'},"import { Notification } from '@boostercloud/framework-core'\n\n@Notification({ topic: 'cart-abandoned' })\nexport class CartAbandoned {}\n"))),(0,a.kt)("p",null,"In this example, the ",(0,a.kt)("inlineCode",{parentName:"p"},"CartAbandoned")," notification will be sent to the ",(0,a.kt)("inlineCode",{parentName:"p"},"cart-abandoned")," topic, instead of the default topic."),(0,a.kt)("h2",{id:"separating-by-partition-key"},"Separating by partition key"),(0,a.kt)("p",null,"By default, all the notifications in the application will share a partition key called ",(0,a.kt)("inlineCode",{parentName:"p"},"default"),". This means that, by default, all the notifications in the application will be processed in order, which may not be as performant."),(0,a.kt)("p",null,"To change this, you can use the @partitionKey decorator to specify a field that will be used as a partition key for each notification:"),(0,a.kt)(o.Z,{mdxType:"TerminalWindow"},(0,a.kt)("pre",null,(0,a.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/notifications/cart-abandoned-partition-key.ts"',title:'"src/notifications/cart-abandoned-partition-key.ts"'},"import { Notification, partitionKey } from '@boostercloud/framework-core'\n\n@Notification({ topic: 'cart-abandoned' })\nexport class CartAbandoned {\n public constructor(@partitionKey readonly key: string) {}\n}\n"))),(0,a.kt)("p",null,"In this example, each ",(0,a.kt)("inlineCode",{parentName:"p"},"CartAbandoned")," notification will have its own partition key, which is specified in the constructor as the field ",(0,a.kt)("inlineCode",{parentName:"p"},"key"),", it can be called in any way you want. This will allow for parallel processing of notifications, making the system more performant."),(0,a.kt)("h2",{id:"reacting-to-notifications"},"Reacting to notifications"),(0,a.kt)("p",null,"Just like events, notifications can be handled by event handlers in order to trigger other processes. 
Event handlers are responsible for listening to events and notifications, and then performing specific actions in response to them."),(0,a.kt)("p",null,"In conclusion, defining notifications in the Booster Framework is a simple and straightforward process that can be done using the ",(0,a.kt)("inlineCode",{parentName:"p"},"@Notification")," and ",(0,a.kt)("inlineCode",{parentName:"p"},"@partitionKey")," decorators."))}f.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/3c6e0dde.3d268a55.js b/assets/js/3c6e0dde.3d268a55.js deleted file mode 100644 index e1afd9d35..000000000 --- a/assets/js/3c6e0dde.3d268a55.js +++ /dev/null @@ -1 +0,0 @@ -"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[4454],{3905:(e,t,n)=>{n.d(t,{Zo:()=>d,kt:()=>m});var a=n(7294);function o(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function r(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);t&&(a=a.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,a)}return n}function i(e){for(var t=1;t=0||(o[n]=e[n]);return o}(e,t);if(Object.getOwnPropertySymbols){var r=Object.getOwnPropertySymbols(e);for(a=0;a=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(o[n]=e[n])}return o}var s=a.createContext({}),p=function(e){var t=a.useContext(s),n=t;return e&&(n="function"==typeof e?e(t):i(i({},t),e)),n},d=function(e){var t=p(e.components);return a.createElement(s.Provider,{value:t},e.children)},c={inlineCode:"code",wrapper:function(e){var t=e.children;return a.createElement(a.Fragment,{},t)}},u=a.forwardRef((function(e,t){var n=e.components,o=e.mdxType,r=e.originalType,s=e.parentName,d=l(e,["components","mdxType","originalType","parentName"]),u=p(n),m=o,h=u["".concat(s,".").concat(m)]||u[m]||c[m]||r;return n?a.createElement(h,i(i({ref:t},d),{},{components:n})):a.createElement(h,i({ref:t},d))}));function m(e,t){var n=arguments,o=t&&t.mdxType;if("string"==typeof e||o){var r=n.length,i=new Array(r);i[0]=u;var l={};for(var s in t)hasOwnProperty.call(t,s)&&(l[s]=t[s]);l.originalType=e,l.mdxType="string"==typeof e?e:o,i[1]=l;for(var p=2;p{n.d(t,{do:()=>l,dM:()=>i,Dh:()=>s});var a=n(7294),o=n(719);const r=e=>{let{href:t,onClick:n,children:o}=e;return a.createElement("a",{href:t,target:"_blank",rel:"noopener noreferrer",onClick:e=>{n&&n()}},o)},i=e=>{let{children:t}=e;return p(t,"YY7T3ZSZ")},l=e=>{let{children:t}=e;return p(t,"NE1EADCK")},s=e=>{let{children:t}=e;return p(t,"AXTW7ICE")};function p(e,t){const{text:n,href:i}=function(e){if(a.isValidElement(e)&&e.props.href)return{text:e.props.children,href:e.props.href};return{text:"",href:""}}(e);return a.createElement(r,{href:i,onClick:()=>o.R.startAndTrackEvent(t)},n)}},5163:(e,t,n)=>{n.d(t,{Z:()=>p});var a=n(7294);const o="terminalWindow_wGrl",r="terminalWindowHeader_o9Cs",i="buttons_IGLB",l="dot_fGZE",s="terminalWindowBody_tzdS";function p(e){let{children:t}=e;return a.createElement("div",{className:o},a.createElement("div",{className:r},a.createElement("div",{className:i},a.createElement("span",{className:l,style:{background:"#f25f58"}}),a.createElement("span",{className:l,style:{background:"#fbbe3c"}}),a.createElement("span",{className:l,style:{background:"#58cb42"}}))),a.createElement("div",{className:s},t))}},4792:(e,t,n)=>{n.r(t),n.d(t,{assets:()=>d,contentTitle:()=>s,default:()=>m,frontMatter:()=>l,metadata:()=>p,toc:()=>c});var 
a=n(7462),o=(n(7294),n(3905)),r=n(5163),i=n(2735);const l={description:"How to have the backend up and running for a blog application in a few minutes"},s="Build a Booster app in minutes",p={unversionedId:"getting-started/coding",id:"getting-started/coding",title:"Build a Booster app in minutes",description:"How to have the backend up and running for a blog application in a few minutes",source:"@site/docs/02_getting-started/coding.mdx",sourceDirName:"02_getting-started",slug:"/getting-started/coding",permalink:"/getting-started/coding",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/02_getting-started/coding.mdx",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1696768385,formattedLastUpdatedAt:"Oct 8, 2023",frontMatter:{description:"How to have the backend up and running for a blog application in a few minutes"},sidebar:"docs",previous:{title:"Installation",permalink:"/getting-started/installation"},next:{title:"Booster architecture",permalink:"/architecture/event-driven"}},d={},c=[{value:"1. Create the project",id:"1-create-the-project",level:3},{value:"2. First command",id:"2-first-command",level:3},{value:"3. First event",id:"3-first-event",level:3},{value:"4. First entity",id:"4-first-entity",level:3},{value:"5. First read model",id:"5-first-read-model",level:3},{value:"6. Deployment",id:"6-deployment",level:3},{value:"6.1 Running your application locally",id:"61-running-your-application-locally",level:4},{value:"6.2 Deploying to the cloud",id:"62-deploying-to-the-cloud",level:4},{value:"7. Testing",id:"7-testing",level:3},{value:"7.1 Creating posts",id:"71-creating-posts",level:4},{value:"7.2 Retrieving all posts",id:"72-retrieving-all-posts",level:4},{value:"7.3 Retrieving specific post",id:"73-retrieving-specific-post",level:4},{value:"8. Removing the stack",id:"8-removing-the-stack",level:3},{value:"9. More functionalities",id:"9-more-functionalities",level:3},{value:"Examples and walkthroughs",id:"examples-and-walkthroughs",level:2},{value:"Creation of a question-asking application backend",id:"creation-of-a-question-asking-application-backend",level:3},{value:"All the guides and examples",id:"all-the-guides-and-examples",level:3}],u={toc:c};function m(e){let{components:t,...l}=e;return(0,o.kt)("wrapper",(0,a.Z)({},u,l,{components:t,mdxType:"MDXLayout"}),(0,o.kt)("h1",{id:"build-a-booster-app-in-minutes"},"Build a Booster app in minutes"),(0,o.kt)("p",null,"In this section, we will go through all the necessary steps to have the backend up and\nrunning for a blog application in just a few minutes."),(0,o.kt)("p",null,"Before starting, make sure to ",(0,o.kt)("a",{parentName:"p",href:"/getting-started/installation"},"have Booster CLI installed"),". If you also want to deploy your application to your cloud provider, check out the ",(0,o.kt)("a",{parentName:"p",href:"../going-deeper/infrastructure-providers"},"Provider configuration")," section."),(0,o.kt)("h3",{id:"1-create-the-project"},"1. Create the project"),(0,o.kt)("p",null,"First of all, we will use the Booster CLI tool generators to create a project."),(0,o.kt)("p",null,"In your favourite terminal, run this command ",(0,o.kt)("inlineCode",{parentName:"p"},"boost new:project boosted-blog")," and follow\nthe instructions. 
After some prompted questions, the CLI will ask you to select one of the available providers to set up as the main provider that will be used."),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-shell"},"? What's the package name of your provider infrastructure library? (Use arrow keys)\n\u276f @boostercloud/framework-provider-aws (AWS)\n @boostercloud/framework-provider-azure (Azure)\n Other\n"))),(0,o.kt)("p",null,"When asked for the provider, select AWS as that is what we have\nconfigured ",(0,o.kt)("a",{parentName:"p",href:"../going-deeper/infrastructure-providers#aws-provider-setup"},"here")," for the example. You can use another provider if you want, or add more providers once you have created the project."),(0,o.kt)("p",null,"If you don't know what provider you are going to use, and you just want to execute your Booster application locally, you can select one and change it later!"),(0,o.kt)("p",null,"After choosing your provider, you will see your project generated!:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-shell"},"> boost new:project boosted-blog\n\n...\n\n\u2139 boost new \ud83d\udea7\n\u2714 Creating project root\n\u2714 Generating config files\n\u2714 Installing dependencies\n\u2139 Project generated!\n"))),(0,o.kt)("admonition",{type:"tip"},(0,o.kt)("p",{parentName:"admonition"},"If you prefer to create the project with default parameters, you can run the command as ",(0,o.kt)("inlineCode",{parentName:"p"},"boost new:project booster-blog --default"),". The default\nparameters are as follows:"),(0,o.kt)("ul",{parentName:"admonition"},(0,o.kt)("li",{parentName:"ul"},'Project name: The one provided when running the command, in this case "booster-blog"'),(0,o.kt)("li",{parentName:"ul"},"Provider: AWS"),(0,o.kt)("li",{parentName:"ul"},'Description, author, homepage and repository: ""'),(0,o.kt)("li",{parentName:"ul"},"License: MIT"),(0,o.kt)("li",{parentName:"ul"},"Version: 0.1.0"))),(0,o.kt)("p",null,"In case you want to specify each parameter without following the instructions, you can use the following flags with this structure ",(0,o.kt)("inlineCode",{parentName:"p"},"="),"."),(0,o.kt)("table",null,(0,o.kt)("thead",{parentName:"table"},(0,o.kt)("tr",{parentName:"thead"},(0,o.kt)("th",{parentName:"tr",align:"left"},"Flag"),(0,o.kt)("th",{parentName:"tr",align:"left"},"Short version"),(0,o.kt)("th",{parentName:"tr",align:"left"},"Description"))),(0,o.kt)("tbody",{parentName:"table"},(0,o.kt)("tr",{parentName:"tbody"},(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"--homepage")),(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"-H")),(0,o.kt)("td",{parentName:"tr",align:"left"},"The website of this project")),(0,o.kt)("tr",{parentName:"tbody"},(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"--author")),(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"-a")),(0,o.kt)("td",{parentName:"tr",align:"left"},"Author of this project")),(0,o.kt)("tr",{parentName:"tbody"},(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"--description")),(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"-d")),(0,o.kt)("td",{parentName:"tr",align:"left"},"A short 
description")),(0,o.kt)("tr",{parentName:"tbody"},(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"--license")),(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"-l")),(0,o.kt)("td",{parentName:"tr",align:"left"},"License used in this project")),(0,o.kt)("tr",{parentName:"tbody"},(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"--providerPackageName")),(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"-p")),(0,o.kt)("td",{parentName:"tr",align:"left"},"Package name implementing the cloud provider integration where the application will be deployed")),(0,o.kt)("tr",{parentName:"tbody"},(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"--repository")),(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"-r")),(0,o.kt)("td",{parentName:"tr",align:"left"},"The URL of the repository")),(0,o.kt)("tr",{parentName:"tbody"},(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"--version")),(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"-v")),(0,o.kt)("td",{parentName:"tr",align:"left"},"The initial version")))),(0,o.kt)("p",null,"Additionally, you can use the ",(0,o.kt)("inlineCode",{parentName:"p"},"--skipInstall")," flag if you want to skip installing dependencies and the ",(0,o.kt)("inlineCode",{parentName:"p"},"--skipGit")," flag in case you want to skip git initialization."),(0,o.kt)("blockquote",null,(0,o.kt)("p",{parentName:"blockquote"},"Booster CLI commands follow this structure: ",(0,o.kt)("inlineCode",{parentName:"p"},"boost [] []"),".\nLet's break down the command we have just executed:"),(0,o.kt)("ul",{parentName:"blockquote"},(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"boost")," is the Booster CLI executable"),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"new:project"),' is the "subcommand" part. In this case, it is composed of two parts separated by a colon. The first part, ',(0,o.kt)("inlineCode",{parentName:"li"},"new"),", means that we want to generate a new resource. The second part, ",(0,o.kt)("inlineCode",{parentName:"li"},"project"),", indicates which kind of resource we are interested in. Other examples are ",(0,o.kt)("inlineCode",{parentName:"li"},"new:command"),", ",(0,o.kt)("inlineCode",{parentName:"li"},"new:event"),", etc. We'll see a bunch of them later."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"boosted-blog"),' is a "parameter" for the subcommand ',(0,o.kt)("inlineCode",{parentName:"li"},"new:project"),". Flags and parameters are optional and their meaning and shape depend on the subcommand you used. In this case, we are specifying the name of the project we are creating."))),(0,o.kt)("admonition",{type:"tip"},(0,o.kt)("p",{parentName:"admonition"},"You can always use the ",(0,o.kt)("inlineCode",{parentName:"p"},"--help")," flag to get all the available options for each cli command.")),(0,o.kt)("p",null,"When finished, you'll see some scaffolding that has been generated. 
The project name will be the\nproject's root so ",(0,o.kt)("inlineCode",{parentName:"p"},"cd")," into it:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-shell"},"cd boosted-blog\n"))),(0,o.kt)("p",null,"There you should have these files and directories already generated:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-text"},"boosted-blog\n\u251c\u2500\u2500 .eslintignore\n\u251c\u2500\u2500 .gitignore\n\u251c\u2500\u2500 .eslintrc.js\n\u251c\u2500\u2500 .prettierrc.yaml\n\u251c\u2500\u2500 package-lock.json\n\u251c\u2500\u2500 package.json\n\u251c\u2500\u2500 src\n\u2502 \u251c\u2500\u2500 commands\n\u2502 \u251c\u2500\u2500 common\n\u2502 \u251c\u2500\u2500 config\n\u2502 \u2502 \u2514\u2500\u2500 config.ts\n\u2502 \u251c\u2500\u2500 entities\n\u2502 \u251c\u2500\u2500 events\n\u2502 \u251c\u2500\u2500 event-handlers\n\u2502 \u251c\u2500\u2500 read-models\n\u2502 \u2514\u2500\u2500 index.ts\n\u251c\u2500\u2500 tsconfig.eslint.json\n\u2514\u2500\u2500 tsconfig.json\n")),(0,o.kt)("p",null,"Now open the project in your favorite editor, e.g. ",(0,o.kt)("a",{parentName:"p",href:"https://code.visualstudio.com/"},"Visual Studio Code"),"."),(0,o.kt)("h3",{id:"2-first-command"},"2. First command"),(0,o.kt)("p",null,"Commands define the input to our system, so we'll start by generating our first\n",(0,o.kt)("a",{parentName:"p",href:"/architecture/command"},"command")," to create posts. Use the command generator, while in the project's root\ndirectory, as follows:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost new:command CreatePost --fields postId:UUID title:string content:string author:string\n"))),(0,o.kt)("p",null,"The ",(0,o.kt)("inlineCode",{parentName:"p"},"new:command")," generator creates a ",(0,o.kt)("inlineCode",{parentName:"p"},"create-post.ts")," file in the ",(0,o.kt)("inlineCode",{parentName:"p"},"commands")," folder:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-text"},"boosted-blog\n\u2514\u2500\u2500 src\n \u2514\u2500\u2500 commands\n \u2514\u2500\u2500 create-post.ts\n")),(0,o.kt)("p",null,"As we mentioned before, commands are the input of our system. They're sent\nby the users of our application. When they are received you can validate its data,\nexecute some business logic, and register one or more events. Therefore, we have to define two more things:"),(0,o.kt)("ol",null,(0,o.kt)("li",{parentName:"ol"},"Who is authorized to run this command."),(0,o.kt)("li",{parentName:"ol"},"The events that it will trigger.")),(0,o.kt)("p",null,"Booster allows you to define authorization strategies (we will cover that\nlater). Let's start by allowing anyone to send this command to our application.\nTo do that, open the file we have just generated and add the string ",(0,o.kt)("inlineCode",{parentName:"p"},"'all'")," to the\n",(0,o.kt)("inlineCode",{parentName:"p"},"authorize")," parameter of the ",(0,o.kt)("inlineCode",{parentName:"p"},"@Command")," decorator. Your ",(0,o.kt)("inlineCode",{parentName:"p"},"CreatePost")," command should look like this:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"@Command({\n authorize: 'all', // Specify authorized roles here. 
Use 'all' to authorize anyone\n})\nexport class CreatePost {\n public constructor(\n readonly postId: UUID,\n readonly title: string,\n readonly content: string,\n readonly author: string\n ) {}\n\n public static async handle(command: CreatePost, register: Register): Promise {\n register.events(/* YOUR EVENT HERE */)\n }\n}\n")),(0,o.kt)("h3",{id:"3-first-event"},"3. First event"),(0,o.kt)("p",null,"Instead of creating, updating, or deleting objects, Booster stores data in the form of events.\nThey are records of facts and represent the source of truth. Let's generate an event called ",(0,o.kt)("inlineCode",{parentName:"p"},"PostCreated"),"\nthat will contain the initial post info:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost new:event PostCreated --fields postId:UUID title:string content:string author:string\n"))),(0,o.kt)("p",null,"The ",(0,o.kt)("inlineCode",{parentName:"p"},"new:event")," generator creates a new file under the ",(0,o.kt)("inlineCode",{parentName:"p"},"src/events")," directory.\nThe name of the file is the name of the event:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-text"},"boosted-blog\n\u2514\u2500\u2500 src\n \u2514\u2500\u2500 events\n \u2514\u2500\u2500 post-created.ts\n")),(0,o.kt)("p",null,"All events in Booster must target an entity, so we need to implement an ",(0,o.kt)("inlineCode",{parentName:"p"},"entityID"),"\nmethod. From there, we'll return the identifier of the post created, the field\n",(0,o.kt)("inlineCode",{parentName:"p"},"postID"),". This identifier will be used later by Booster to build the final state\nof the ",(0,o.kt)("inlineCode",{parentName:"p"},"Post")," automatically. Edit the ",(0,o.kt)("inlineCode",{parentName:"p"},"entityID")," method in ",(0,o.kt)("inlineCode",{parentName:"p"},"events/post-created.ts"),"\nto return our ",(0,o.kt)("inlineCode",{parentName:"p"},"postID"),":"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"// src/events/post-created.ts\n\n@Event\nexport class PostCreated {\n public constructor(\n readonly postId: UUID,\n readonly title: string,\n readonly content: string,\n readonly author: string\n ) {}\n\n public entityID(): UUID {\n return this.postId\n }\n}\n")),(0,o.kt)("p",null,"Now that we have an event, we can edit the ",(0,o.kt)("inlineCode",{parentName:"p"},"CreatePost")," command to emit it. Let's change\nthe command's ",(0,o.kt)("inlineCode",{parentName:"p"},"handle")," method to look like this:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"// src/commands/create-post.ts::handle\npublic static async handle(command: CreatePost, register: Register): Promise {\n register.events(new PostCreated(command.postId, command.title, command.content, command.author))\n}\n")),(0,o.kt)("p",null,"Remember to import the event class correctly on the top of the file:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"import { PostCreated } from '../events/post-created'\n")),(0,o.kt)("p",null,"We can do any validation in the command handler before storing the event, for our\nexample, we'll just save the received data in the ",(0,o.kt)("inlineCode",{parentName:"p"},"PostCreated")," event."),(0,o.kt)("h3",{id:"4-first-entity"},"4. 
First entity"),(0,o.kt)("p",null,"So far, our ",(0,o.kt)("inlineCode",{parentName:"p"},"PostCreated")," event suggests we need a ",(0,o.kt)("inlineCode",{parentName:"p"},"Post")," entity. Entities are a\nrepresentation of our system internal state. They are in charge of reducing (combining) all the events\nwith the same ",(0,o.kt)("inlineCode",{parentName:"p"},"entityID"),". Let's generate our ",(0,o.kt)("inlineCode",{parentName:"p"},"Post")," entity:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost new:entity Post --fields title:string content:string author:string --reduces PostCreated\n"))),(0,o.kt)("p",null,"You should see now a new file called ",(0,o.kt)("inlineCode",{parentName:"p"},"post.ts")," in the ",(0,o.kt)("inlineCode",{parentName:"p"},"src/entities")," directory."),(0,o.kt)("p",null,"This time, besides using the ",(0,o.kt)("inlineCode",{parentName:"p"},"--fields")," flag, we use the ",(0,o.kt)("inlineCode",{parentName:"p"},"--reduces")," flag to specify the events the entity will reduce and, this way, produce the Post current state. The generator will create one ",(0,o.kt)("em",{parentName:"p"},"reducer function")," for each event we have specified (only one in this case)."),(0,o.kt)("p",null,"Reducer functions in Booster work similarly to the ",(0,o.kt)("inlineCode",{parentName:"p"},"reduce")," callbacks in Javascript: they receive an event\nand the current state of the entity, and returns the next version of the same entity.\nIn this case, when we receive a ",(0,o.kt)("inlineCode",{parentName:"p"},"PostCreated")," event, we can just return a new ",(0,o.kt)("inlineCode",{parentName:"p"},"Post")," entity copying the fields\nfrom the event. There is no previous state of the Post as we are creating it for the first time:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"// src/entities/post.ts\n@Entity\nexport class Post {\n public constructor(public id: UUID, readonly title: string, readonly content: string, readonly author: string) {}\n\n @Reduces(PostCreated)\n public static reducePostCreated(event: PostCreated, currentPost?: Post): Post {\n return new Post(event.postId, event.title, event.content, event.author)\n }\n}\n")),(0,o.kt)("p",null,"Entities represent our domain model and can be queried from command or\nevent handlers to make business decisions or enforcing business rules."),(0,o.kt)("h3",{id:"5-first-read-model"},"5. First read model"),(0,o.kt)("p",null,"In a real application, we rarely want to make public our entire domain model (entities)\nincluding all their fields. What is more, different users may have different views of the data depending\non their permissions or their use cases. That's the goal of ",(0,o.kt)("inlineCode",{parentName:"p"},"ReadModels"),". Client applications can query or\nsubscribe to them."),(0,o.kt)("p",null,"Read models are ",(0,o.kt)("em",{parentName:"p"},"projections")," of one or more entities into a new object that is reachable through the query and subscriptions APIs. 
Let's generate a ",(0,o.kt)("inlineCode",{parentName:"p"},"PostReadModel")," that projects our\n",(0,o.kt)("inlineCode",{parentName:"p"},"Post")," entity:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},"```bash boost new:read-model PostReadModel --fields title:string author:string --projects Post:id ```"),(0,o.kt)("p",null,"We have used a new flag, ",(0,o.kt)("inlineCode",{parentName:"p"},"--projects"),", that allow us to specify the entities (can be many) the read model will\nwatch for changes. You might be wondering what is the ",(0,o.kt)("inlineCode",{parentName:"p"},":id")," after the entity name. That's the ",(0,o.kt)("a",{parentName:"p",href:"/architecture/read-model#the-projection-function"},"joinKey"),",\nbut you can forget about it now."),(0,o.kt)("p",null,"As you might guess, the read-model generator will create a file called\n",(0,o.kt)("inlineCode",{parentName:"p"},"post-read-model.ts")," under ",(0,o.kt)("inlineCode",{parentName:"p"},"src/read-models"),":"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-text"},"boosted-blog\n\u2514\u2500\u2500 src\n \u2514\u2500\u2500 read-models\n \u2514\u2500\u2500 post-read-model.ts\n")),(0,o.kt)("p",null,"There are two things to do when creating a read model:"),(0,o.kt)("ol",null,(0,o.kt)("li",{parentName:"ol"},"Define who is authorized to query or subscribe it"),(0,o.kt)("li",{parentName:"ol"},"Add the logic of the projection functions, where you can filter, combine, etc., the entities fields.")),(0,o.kt)("p",null,"While commands define the input to our system, read models define the output, and together they compound\nthe public API of a Booster application. Let's do the same we did in the command and authorize ",(0,o.kt)("inlineCode",{parentName:"p"},"all")," to\nquery/subscribe the ",(0,o.kt)("inlineCode",{parentName:"p"},"PostReadModel"),". Also, and for learning purposes, we will exclude the ",(0,o.kt)("inlineCode",{parentName:"p"},"content")," field\nfrom the ",(0,o.kt)("inlineCode",{parentName:"p"},"Post")," entity, so it won't be returned when users request the read model."),(0,o.kt)("p",null,"Edit the ",(0,o.kt)("inlineCode",{parentName:"p"},"post-read-model.ts")," file to look like this:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"// src/read-models/post-read-model.ts\n@ReadModel({\n authorize: 'all', // Specify authorized roles here. Use 'all' to authorize anyone\n})\nexport class PostReadModel {\n public constructor(public id: UUID, readonly title: string, readonly author: string) {}\n\n @Projects(Post, 'id')\n public static projectPost(entity: Post, currentPostReadModel?: PostReadModel): ProjectionResult {\n return new PostReadModel(entity.id, entity.title, entity.author)\n }\n}\n")),(0,o.kt)("h3",{id:"6-deployment"},"6. 
Deployment"),(0,o.kt)("p",null,"At this point, we've:"),(0,o.kt)("ul",null,(0,o.kt)("li",{parentName:"ul"},"Created a publicly accessible command"),(0,o.kt)("li",{parentName:"ul"},"Emitted an event as a mechanism to store data"),(0,o.kt)("li",{parentName:"ul"},"Reduced the event into an entity to have a representation of our internal state"),(0,o.kt)("li",{parentName:"ul"},"Projected the entity into a read model that is also publicly accessible.")),(0,o.kt)("p",null,"With this, you already know the basics to build event-driven, CQRS-based applications\nwith Booster."),(0,o.kt)("p",null,"You can check that code compiles correctly by running the build command:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost build\n"))),(0,o.kt)("p",null,"You can also clean the compiled code by running:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost clean\n"))),(0,o.kt)("h4",{id:"61-running-your-application-locally"},"6.1 Running your application locally"),(0,o.kt)("p",null,"Now, let's run our application to see it working. It is as simple as running:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost start -e local\n"))),(0,o.kt)("p",null,"This will execute a local ",(0,o.kt)("inlineCode",{parentName:"p"},"Express.js")," server and will try to expose it in port ",(0,o.kt)("inlineCode",{parentName:"p"},"3000"),". You can change the port by using the ",(0,o.kt)("inlineCode",{parentName:"p"},"-p")," option:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost start -e local -p 8080\n"))),(0,o.kt)("h4",{id:"62-deploying-to-the-cloud"},"6.2 Deploying to the cloud"),(0,o.kt)("p",null,"Also, we can deploy our application to the cloud with no additional changes by running\nthe deploy command:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost deploy -e production\n"))),(0,o.kt)("p",null,"This is the Booster magic! \u2728 When running the start or the deploy commands, Booster will handle the creation of all the resources, ",(0,o.kt)("em",{parentName:"p"},"like Lambdas, API Gateway,"),' and the "glue" between them; ',(0,o.kt)("em",{parentName:"p"},"permissions, events, triggers, etc.")," It even creates a fully functional GraphQL API!"),(0,o.kt)("admonition",{type:"note"},(0,o.kt)("p",{parentName:"admonition"},"Deploy command automatically builds the project for you before performing updates in the cloud provider, so, build command it's not required beforehand.")),(0,o.kt)("blockquote",null,(0,o.kt)("p",{parentName:"blockquote"},"With ",(0,o.kt)("inlineCode",{parentName:"p"},"-e production")," we are specifying which environment we want to deploy. We'll talk about them later.")),(0,o.kt)("admonition",{type:"tip"},(0,o.kt)("p",{parentName:"admonition"},"If at this point you still don\u2019t believe everything is done, feel free to check in your provider\u2019s console. You should see, as in the AWS example below, that the stack and all the services are up and running! It will be the same for other providers. \ud83d\ude80")),(0,o.kt)("p",null,(0,o.kt)("img",{alt:"resources",src:n(2822).Z,width:"2726",height:"1276"})),(0,o.kt)("p",null,"When deploying, it will take a couple of minutes to deploy all the resources. 
Once finished, you will see\ninformation about your application endpoints and other outputs. For this example, we will\nonly need to pick the output ending in ",(0,o.kt)("inlineCode",{parentName:"p"},"httpURL"),", e.g.:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-text"},"https://.execute-api.us-east-1.amazonaws.com/production\n")),(0,o.kt)("admonition",{type:"note"},(0,o.kt)("p",{parentName:"admonition"},"By default, the full error stack trace is send to a local file, ",(0,o.kt)("inlineCode",{parentName:"p"},"./errors.log"),". To see the full error stack trace directly from the console, use the ",(0,o.kt)("inlineCode",{parentName:"p"},"--verbose")," flag.")),(0,o.kt)("h3",{id:"7-testing"},"7. Testing"),(0,o.kt)("p",null,"Let's get started testing the project. We will perform three actions:"),(0,o.kt)("ul",null,(0,o.kt)("li",{parentName:"ul"},"Add a couple of posts"),(0,o.kt)("li",{parentName:"ul"},"Retrieve all posts"),(0,o.kt)("li",{parentName:"ul"},"Retrieve a specific post")),(0,o.kt)("p",null,"Booster applications provide you with a GraphQL API out of the box. You send commands using\n",(0,o.kt)("em",{parentName:"p"},"mutations")," and get read models data using ",(0,o.kt)("em",{parentName:"p"},"queries")," or ",(0,o.kt)("em",{parentName:"p"},"subscriptions"),"."),(0,o.kt)("p",null,"In this section, we will be sending requests by hand using the free ",(0,o.kt)("a",{parentName:"p",href:"https://altair.sirmuel.design/"},"Altair")," GraphQL client,\nwhich is very simple and straightforward for this guide. However, you can use any client you want. Your endpoint URL should look like this:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-text"},"/graphql\n")),(0,o.kt)("h4",{id:"71-creating-posts"},"7.1 Creating posts"),(0,o.kt)("p",null,"Let's use two mutations to send two ",(0,o.kt)("inlineCode",{parentName:"p"},"CreatePost")," commands."),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-graphql"},'mutation {\n CreatePost(\n input: {\n postId: "95ddb544-4a60-439f-a0e4-c57e806f2f6e"\n title: "Build a blog in 10 minutes with Booster"\n content: "I am so excited to write my first post"\n author: "Boosted developer"\n }\n )\n}\n')),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-graphql"},'mutation {\n CreatePost(\n input: {\n postId: "05670e55-fd31-490e-b585-3a0096db0412"\n title: "Booster framework rocks"\n content: "I am so excited for writing the second post"\n author: "Another boosted developer"\n }\n )\n}\n')),(0,o.kt)("p",null,"The expected response for each of those requests should be:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "CreatePost": true\n }\n}\n')),(0,o.kt)("admonition",{type:"note"},(0,o.kt)("p",{parentName:"admonition"},"In this example, the IDs are generated on the client-side. When running production applications consider adding validation for ID uniqueness. 
For this example, we have used ",(0,o.kt)("a",{parentName:"p",href:"https://www.uuidgenerator.net/version4"},"a UUID generator"))),(0,o.kt)("h4",{id:"72-retrieving-all-posts"},"7.2 Retrieving all posts"),(0,o.kt)("p",null,"Let's perform a GraphQL ",(0,o.kt)("inlineCode",{parentName:"p"},"query")," that will be hitting our ",(0,o.kt)("inlineCode",{parentName:"p"},"PostReadModel"),":"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-graphql"},"query {\n PostReadModels {\n id\n title\n author\n }\n}\n")),(0,o.kt)("p",null,"It should respond with something like:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "PostReadModels": [\n {\n "id": "05670e55-fd31-490e-b585-3a0096db0412",\n "title": "Booster framework rocks",\n "author": "Another boosted developer"\n },\n {\n "id": "95ddb544-4a60-439f-a0e4-c57e806f2f6e",\n "title": "Build a blog in 10 minutes with Booster",\n "author": "Boosted developer"\n }\n ]\n }\n}\n')),(0,o.kt)("h4",{id:"73-retrieving-specific-post"},"7.3 Retrieving specific post"),(0,o.kt)("p",null,"It is also possible to retrieve specific a ",(0,o.kt)("inlineCode",{parentName:"p"},"Post")," by adding the ",(0,o.kt)("inlineCode",{parentName:"p"},"id")," as input, e.g.:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-graphql"},'query {\n PostReadModel(id: "95ddb544-4a60-439f-a0e4-c57e806f2f6e") {\n id\n title\n author\n }\n}\n')),(0,o.kt)("p",null,"You should get a response similar to this:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "PostReadModel": {\n "id": "95ddb544-4a60-439f-a0e4-c57e806f2f6e",\n "title": "Build a blog in 10 minutes with Booster",\n "author": "Boosted developer"\n }\n }\n}\n')),(0,o.kt)("h3",{id:"8-removing-the-stack"},"8. Removing the stack"),(0,o.kt)("p",null,"It is convenient to destroy all the infrastructure created after you stop using\nit to avoid generating cloud resource costs. Execute the following command from\nthe root of the project. For safety reasons, you have to confirm this action by\nwriting the project's name, in our case ",(0,o.kt)("inlineCode",{parentName:"p"},"boosted-blog")," that is the same used when\nwe run ",(0,o.kt)("inlineCode",{parentName:"p"},"new:project")," CLI command."),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"> boost nuke -e production\n\n? Please, enter the app name to confirm deletion of all resources: boosted-blog\n"))),(0,o.kt)("blockquote",null,(0,o.kt)("p",{parentName:"blockquote"},"Congratulations! You've built a serverless backend in less than 10 minutes. We hope you have enjoyed discovering the magic of the Booster Framework.")),(0,o.kt)("h3",{id:"9-more-functionalities"},"9. More functionalities"),(0,o.kt)("p",null,"This is a really basic example of a Booster application. 
The are many other features Booster provides like:"),(0,o.kt)("ul",null,(0,o.kt)("li",{parentName:"ul"},"Use a more complex authorization schema for commands and read models based on user roles"),(0,o.kt)("li",{parentName:"ul"},"Use GraphQL subscriptions to get updates in real-time"),(0,o.kt)("li",{parentName:"ul"},"Make events trigger other events"),(0,o.kt)("li",{parentName:"ul"},"Deploy static content"),(0,o.kt)("li",{parentName:"ul"},"Reading entities within command handlers to apply domain-driven decisions"),(0,o.kt)("li",{parentName:"ul"},"And much more...")),(0,o.kt)("p",null,"Continue reading to dig more. You've just scratched the surface of all the Booster\ncapabilities!"),(0,o.kt)("h2",{id:"examples-and-walkthroughs"},"Examples and walkthroughs"),(0,o.kt)("h3",{id:"creation-of-a-question-asking-application-backend"},"Creation of a question-asking application backend"),(0,o.kt)("p",null,"In the following video, you will find how to create a backend for a question-asking application from scratch. This application would allow\nusers to create questions and like them. This video goes from creating the project to incrementally deploying features in the application.\nYou can find the code both for the frontend and the backend in ",(0,o.kt)(i.do,{mdxType:"CLAskMeRepo"},(0,o.kt)("a",{parentName:"p",href:"https://github.com/boostercloud/examples/tree/master/askme"},"this GitHub repo")),"."),(0,o.kt)("div",{align:"center"},(0,o.kt)("iframe",{width:"560",height:"315",src:"https://www.youtube.com/embed/C4K2M-orT8k",title:"YouTube video player",frameBorder:"0",allow:"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture",allowFullScreen:!0})),(0,o.kt)("h3",{id:"all-the-guides-and-examples"},"All the guides and examples"),(0,o.kt)("p",null,"Check out the ",(0,o.kt)(i.dM,{mdxType:"CLExampleApps"},(0,o.kt)("a",{parentName:"p",href:"https://github.com/boostercloud/examples"},"example apps repository"))," to see Booster in use."))}m.isMDXComponent=!0},2822:(e,t,n)=>{n.d(t,{Z:()=>a});const a=n.p+"assets/images/aws-resources-e620ed48140a022aae2ca68d0c52b496.png"}}]); \ No newline at end of file diff --git a/assets/js/3c6e0dde.75ab9ffc.js b/assets/js/3c6e0dde.75ab9ffc.js new file mode 100644 index 000000000..4e71b87cf --- /dev/null +++ b/assets/js/3c6e0dde.75ab9ffc.js @@ -0,0 +1 @@ +"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[4454],{3905:(e,t,n)=>{n.d(t,{Zo:()=>d,kt:()=>m});var a=n(7294);function o(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function r(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);t&&(a=a.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,a)}return n}function i(e){for(var t=1;t=0||(o[n]=e[n]);return o}(e,t);if(Object.getOwnPropertySymbols){var r=Object.getOwnPropertySymbols(e);for(a=0;a=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(o[n]=e[n])}return o}var s=a.createContext({}),p=function(e){var t=a.useContext(s),n=t;return e&&(n="function"==typeof e?e(t):i(i({},t),e)),n},d=function(e){var t=p(e.components);return a.createElement(s.Provider,{value:t},e.children)},c={inlineCode:"code",wrapper:function(e){var t=e.children;return a.createElement(a.Fragment,{},t)}},u=a.forwardRef((function(e,t){var 
n=e.components,o=e.mdxType,r=e.originalType,s=e.parentName,d=l(e,["components","mdxType","originalType","parentName"]),u=p(n),m=o,h=u["".concat(s,".").concat(m)]||u[m]||c[m]||r;return n?a.createElement(h,i(i({ref:t},d),{},{components:n})):a.createElement(h,i({ref:t},d))}));function m(e,t){var n=arguments,o=t&&t.mdxType;if("string"==typeof e||o){var r=n.length,i=new Array(r);i[0]=u;var l={};for(var s in t)hasOwnProperty.call(t,s)&&(l[s]=t[s]);l.originalType=e,l.mdxType="string"==typeof e?e:o,i[1]=l;for(var p=2;p{n.d(t,{do:()=>l,dM:()=>i,Dh:()=>s});var a=n(7294),o=n(719);const r=e=>{let{href:t,onClick:n,children:o}=e;return a.createElement("a",{href:t,target:"_blank",rel:"noopener noreferrer",onClick:e=>{n&&n()}},o)},i=e=>{let{children:t}=e;return p(t,"YY7T3ZSZ")},l=e=>{let{children:t}=e;return p(t,"NE1EADCK")},s=e=>{let{children:t}=e;return p(t,"AXTW7ICE")};function p(e,t){const{text:n,href:i}=function(e){if(a.isValidElement(e)&&e.props.href)return{text:e.props.children,href:e.props.href};return{text:"",href:""}}(e);return a.createElement(r,{href:i,onClick:()=>o.R.startAndTrackEvent(t)},n)}},5163:(e,t,n)=>{n.d(t,{Z:()=>p});var a=n(7294);const o="terminalWindow_wGrl",r="terminalWindowHeader_o9Cs",i="buttons_IGLB",l="dot_fGZE",s="terminalWindowBody_tzdS";function p(e){let{children:t}=e;return a.createElement("div",{className:o},a.createElement("div",{className:r},a.createElement("div",{className:i},a.createElement("span",{className:l,style:{background:"#f25f58"}}),a.createElement("span",{className:l,style:{background:"#fbbe3c"}}),a.createElement("span",{className:l,style:{background:"#58cb42"}}))),a.createElement("div",{className:s},t))}},4792:(e,t,n)=>{n.r(t),n.d(t,{assets:()=>d,contentTitle:()=>s,default:()=>m,frontMatter:()=>l,metadata:()=>p,toc:()=>c});var a=n(7462),o=(n(7294),n(3905)),r=n(5163),i=n(2735);const l={description:"How to have the backend up and running for a blog application in a few minutes"},s="Build a Booster app in minutes",p={unversionedId:"getting-started/coding",id:"getting-started/coding",title:"Build a Booster app in minutes",description:"How to have the backend up and running for a blog application in a few minutes",source:"@site/docs/02_getting-started/coding.mdx",sourceDirName:"02_getting-started",slug:"/getting-started/coding",permalink:"/getting-started/coding",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/02_getting-started/coding.mdx",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1697652123,formattedLastUpdatedAt:"Oct 18, 2023",frontMatter:{description:"How to have the backend up and running for a blog application in a few minutes"},sidebar:"docs",previous:{title:"Installation",permalink:"/getting-started/installation"},next:{title:"Booster architecture",permalink:"/architecture/event-driven"}},d={},c=[{value:"1. Create the project",id:"1-create-the-project",level:3},{value:"2. First command",id:"2-first-command",level:3},{value:"3. First event",id:"3-first-event",level:3},{value:"4. First entity",id:"4-first-entity",level:3},{value:"5. First read model",id:"5-first-read-model",level:3},{value:"6. Deployment",id:"6-deployment",level:3},{value:"6.1 Running your application locally",id:"61-running-your-application-locally",level:4},{value:"6.2 Deploying to the cloud",id:"62-deploying-to-the-cloud",level:4},{value:"7. 
Testing",id:"7-testing",level:3},{value:"7.1 Creating posts",id:"71-creating-posts",level:4},{value:"7.2 Retrieving all posts",id:"72-retrieving-all-posts",level:4},{value:"7.3 Retrieving specific post",id:"73-retrieving-specific-post",level:4},{value:"8. Removing the stack",id:"8-removing-the-stack",level:3},{value:"9. More functionalities",id:"9-more-functionalities",level:3},{value:"Examples and walkthroughs",id:"examples-and-walkthroughs",level:2},{value:"Creation of a question-asking application backend",id:"creation-of-a-question-asking-application-backend",level:3},{value:"All the guides and examples",id:"all-the-guides-and-examples",level:3}],u={toc:c};function m(e){let{components:t,...l}=e;return(0,o.kt)("wrapper",(0,a.Z)({},u,l,{components:t,mdxType:"MDXLayout"}),(0,o.kt)("h1",{id:"build-a-booster-app-in-minutes"},"Build a Booster app in minutes"),(0,o.kt)("p",null,"In this section, we will go through all the necessary steps to have the backend up and\nrunning for a blog application in just a few minutes."),(0,o.kt)("p",null,"Before starting, make sure to ",(0,o.kt)("a",{parentName:"p",href:"/getting-started/installation"},"have Booster CLI installed"),". If you also want to deploy your application to your cloud provider, check out the ",(0,o.kt)("a",{parentName:"p",href:"../going-deeper/infrastructure-providers"},"Provider configuration")," section."),(0,o.kt)("h3",{id:"1-create-the-project"},"1. Create the project"),(0,o.kt)("p",null,"First of all, we will use the Booster CLI tool generators to create a project."),(0,o.kt)("p",null,"In your favourite terminal, run this command ",(0,o.kt)("inlineCode",{parentName:"p"},"boost new:project boosted-blog")," and follow\nthe instructions. After some prompted questions, the CLI will ask you to select one of the available providers to set up as the main provider that will be used."),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-shell"},"? What's the package name of your provider infrastructure library? (Use arrow keys)\n\u276f @boostercloud/framework-provider-aws (AWS)\n @boostercloud/framework-provider-azure (Azure)\n Other\n"))),(0,o.kt)("p",null,"When asked for the provider, select AWS as that is what we have\nconfigured ",(0,o.kt)("a",{parentName:"p",href:"../going-deeper/infrastructure-providers#aws-provider-setup"},"here")," for the example. You can use another provider if you want, or add more providers once you have created the project."),(0,o.kt)("p",null,"If you don't know what provider you are going to use, and you just want to execute your Booster application locally, you can select one and change it later!"),(0,o.kt)("p",null,"After choosing your provider, you will see your project generated!:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-shell"},"> boost new:project boosted-blog\n\n...\n\n\u2139 boost new \ud83d\udea7\n\u2714 Creating project root\n\u2714 Generating config files\n\u2714 Installing dependencies\n\u2139 Project generated!\n"))),(0,o.kt)("admonition",{type:"tip"},(0,o.kt)("p",{parentName:"admonition"},"If you prefer to create the project with default parameters, you can run the command as ",(0,o.kt)("inlineCode",{parentName:"p"},"boost new:project booster-blog --default"),". 
The default\nparameters are as follows:"),(0,o.kt)("ul",{parentName:"admonition"},(0,o.kt)("li",{parentName:"ul"},'Project name: The one provided when running the command, in this case "booster-blog"'),(0,o.kt)("li",{parentName:"ul"},"Provider: AWS"),(0,o.kt)("li",{parentName:"ul"},'Description, author, homepage and repository: ""'),(0,o.kt)("li",{parentName:"ul"},"License: MIT"),(0,o.kt)("li",{parentName:"ul"},"Version: 0.1.0"))),(0,o.kt)("p",null,"In case you want to specify each parameter without following the instructions, you can use the following flags with this structure ",(0,o.kt)("inlineCode",{parentName:"p"},"="),"."),(0,o.kt)("table",null,(0,o.kt)("thead",{parentName:"table"},(0,o.kt)("tr",{parentName:"thead"},(0,o.kt)("th",{parentName:"tr",align:"left"},"Flag"),(0,o.kt)("th",{parentName:"tr",align:"left"},"Short version"),(0,o.kt)("th",{parentName:"tr",align:"left"},"Description"))),(0,o.kt)("tbody",{parentName:"table"},(0,o.kt)("tr",{parentName:"tbody"},(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"--homepage")),(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"-H")),(0,o.kt)("td",{parentName:"tr",align:"left"},"The website of this project")),(0,o.kt)("tr",{parentName:"tbody"},(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"--author")),(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"-a")),(0,o.kt)("td",{parentName:"tr",align:"left"},"Author of this project")),(0,o.kt)("tr",{parentName:"tbody"},(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"--description")),(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"-d")),(0,o.kt)("td",{parentName:"tr",align:"left"},"A short description")),(0,o.kt)("tr",{parentName:"tbody"},(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"--license")),(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"-l")),(0,o.kt)("td",{parentName:"tr",align:"left"},"License used in this project")),(0,o.kt)("tr",{parentName:"tbody"},(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"--providerPackageName")),(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"-p")),(0,o.kt)("td",{parentName:"tr",align:"left"},"Package name implementing the cloud provider integration where the application will be deployed")),(0,o.kt)("tr",{parentName:"tbody"},(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"--repository")),(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"-r")),(0,o.kt)("td",{parentName:"tr",align:"left"},"The URL of the repository")),(0,o.kt)("tr",{parentName:"tbody"},(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"--version")),(0,o.kt)("td",{parentName:"tr",align:"left"},(0,o.kt)("inlineCode",{parentName:"td"},"-v")),(0,o.kt)("td",{parentName:"tr",align:"left"},"The initial version")))),(0,o.kt)("p",null,"Additionally, you can use the ",(0,o.kt)("inlineCode",{parentName:"p"},"--skipInstall")," flag if you want to skip installing dependencies and the ",(0,o.kt)("inlineCode",{parentName:"p"},"--skipGit")," flag in case you want to skip git initialization."),(0,o.kt)("blockquote",null,(0,o.kt)("p",{parentName:"blockquote"},"Booster CLI commands follow this structure: 
",(0,o.kt)("inlineCode",{parentName:"p"},"boost [] []"),".\nLet's break down the command we have just executed:"),(0,o.kt)("ul",{parentName:"blockquote"},(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"boost")," is the Booster CLI executable"),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"new:project"),' is the "subcommand" part. In this case, it is composed of two parts separated by a colon. The first part, ',(0,o.kt)("inlineCode",{parentName:"li"},"new"),", means that we want to generate a new resource. The second part, ",(0,o.kt)("inlineCode",{parentName:"li"},"project"),", indicates which kind of resource we are interested in. Other examples are ",(0,o.kt)("inlineCode",{parentName:"li"},"new:command"),", ",(0,o.kt)("inlineCode",{parentName:"li"},"new:event"),", etc. We'll see a bunch of them later."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"boosted-blog"),' is a "parameter" for the subcommand ',(0,o.kt)("inlineCode",{parentName:"li"},"new:project"),". Flags and parameters are optional and their meaning and shape depend on the subcommand you used. In this case, we are specifying the name of the project we are creating."))),(0,o.kt)("admonition",{type:"tip"},(0,o.kt)("p",{parentName:"admonition"},"You can always use the ",(0,o.kt)("inlineCode",{parentName:"p"},"--help")," flag to get all the available options for each cli command.")),(0,o.kt)("p",null,"When finished, you'll see some scaffolding that has been generated. The project name will be the\nproject's root so ",(0,o.kt)("inlineCode",{parentName:"p"},"cd")," into it:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-shell"},"cd boosted-blog\n"))),(0,o.kt)("p",null,"There you should have these files and directories already generated:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-text"},"boosted-blog\n\u251c\u2500\u2500 .eslintignore\n\u251c\u2500\u2500 .gitignore\n\u251c\u2500\u2500 .eslintrc.js\n\u251c\u2500\u2500 .prettierrc.yaml\n\u251c\u2500\u2500 package-lock.json\n\u251c\u2500\u2500 package.json\n\u251c\u2500\u2500 src\n\u2502 \u251c\u2500\u2500 commands\n\u2502 \u251c\u2500\u2500 common\n\u2502 \u251c\u2500\u2500 config\n\u2502 \u2502 \u2514\u2500\u2500 config.ts\n\u2502 \u251c\u2500\u2500 entities\n\u2502 \u251c\u2500\u2500 events\n\u2502 \u251c\u2500\u2500 event-handlers\n\u2502 \u251c\u2500\u2500 read-models\n\u2502 \u2514\u2500\u2500 index.ts\n\u251c\u2500\u2500 tsconfig.eslint.json\n\u2514\u2500\u2500 tsconfig.json\n")),(0,o.kt)("p",null,"Now open the project in your favorite editor, e.g. ",(0,o.kt)("a",{parentName:"p",href:"https://code.visualstudio.com/"},"Visual Studio Code"),"."),(0,o.kt)("h3",{id:"2-first-command"},"2. First command"),(0,o.kt)("p",null,"Commands define the input to our system, so we'll start by generating our first\n",(0,o.kt)("a",{parentName:"p",href:"/architecture/command"},"command")," to create posts. 
Use the command generator, while in the project's root\ndirectory, as follows:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost new:command CreatePost --fields postId:UUID title:string content:string author:string\n"))),(0,o.kt)("p",null,"The ",(0,o.kt)("inlineCode",{parentName:"p"},"new:command")," generator creates a ",(0,o.kt)("inlineCode",{parentName:"p"},"create-post.ts")," file in the ",(0,o.kt)("inlineCode",{parentName:"p"},"commands")," folder:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-text"},"boosted-blog\n\u2514\u2500\u2500 src\n \u2514\u2500\u2500 commands\n \u2514\u2500\u2500 create-post.ts\n")),(0,o.kt)("p",null,"As we mentioned before, commands are the input of our system. They're sent\nby the users of our application. When they are received you can validate its data,\nexecute some business logic, and register one or more events. Therefore, we have to define two more things:"),(0,o.kt)("ol",null,(0,o.kt)("li",{parentName:"ol"},"Who is authorized to run this command."),(0,o.kt)("li",{parentName:"ol"},"The events that it will trigger.")),(0,o.kt)("p",null,"Booster allows you to define authorization strategies (we will cover that\nlater). Let's start by allowing anyone to send this command to our application.\nTo do that, open the file we have just generated and add the string ",(0,o.kt)("inlineCode",{parentName:"p"},"'all'")," to the\n",(0,o.kt)("inlineCode",{parentName:"p"},"authorize")," parameter of the ",(0,o.kt)("inlineCode",{parentName:"p"},"@Command")," decorator. Your ",(0,o.kt)("inlineCode",{parentName:"p"},"CreatePost")," command should look like this:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"@Command({\n authorize: 'all', // Specify authorized roles here. Use 'all' to authorize anyone\n})\nexport class CreatePost {\n public constructor(\n readonly postId: UUID,\n readonly title: string,\n readonly content: string,\n readonly author: string\n ) {}\n\n public static async handle(command: CreatePost, register: Register): Promise {\n register.events(/* YOUR EVENT HERE */)\n }\n}\n")),(0,o.kt)("h3",{id:"3-first-event"},"3. First event"),(0,o.kt)("p",null,"Instead of creating, updating, or deleting objects, Booster stores data in the form of events.\nThey are records of facts and represent the source of truth. Let's generate an event called ",(0,o.kt)("inlineCode",{parentName:"p"},"PostCreated"),"\nthat will contain the initial post info:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost new:event PostCreated --fields postId:UUID title:string content:string author:string\n"))),(0,o.kt)("p",null,"The ",(0,o.kt)("inlineCode",{parentName:"p"},"new:event")," generator creates a new file under the ",(0,o.kt)("inlineCode",{parentName:"p"},"src/events")," directory.\nThe name of the file is the name of the event:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-text"},"boosted-blog\n\u2514\u2500\u2500 src\n \u2514\u2500\u2500 events\n \u2514\u2500\u2500 post-created.ts\n")),(0,o.kt)("p",null,"All events in Booster must target an entity, so we need to implement an ",(0,o.kt)("inlineCode",{parentName:"p"},"entityID"),"\nmethod. From there, we'll return the identifier of the post created, the field\n",(0,o.kt)("inlineCode",{parentName:"p"},"postID"),". 
This identifier will be used later by Booster to build the final state\nof the ",(0,o.kt)("inlineCode",{parentName:"p"},"Post")," automatically. Edit the ",(0,o.kt)("inlineCode",{parentName:"p"},"entityID")," method in ",(0,o.kt)("inlineCode",{parentName:"p"},"events/post-created.ts"),"\nto return our ",(0,o.kt)("inlineCode",{parentName:"p"},"postID"),":"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"// src/events/post-created.ts\n\n@Event\nexport class PostCreated {\n public constructor(\n readonly postId: UUID,\n readonly title: string,\n readonly content: string,\n readonly author: string\n ) {}\n\n public entityID(): UUID {\n return this.postId\n }\n}\n")),(0,o.kt)("p",null,"Now that we have an event, we can edit the ",(0,o.kt)("inlineCode",{parentName:"p"},"CreatePost")," command to emit it. Let's change\nthe command's ",(0,o.kt)("inlineCode",{parentName:"p"},"handle")," method to look like this:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"// src/commands/create-post.ts::handle\npublic static async handle(command: CreatePost, register: Register): Promise {\n register.events(new PostCreated(command.postId, command.title, command.content, command.author))\n}\n")),(0,o.kt)("p",null,"Remember to import the event class correctly on the top of the file:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"import { PostCreated } from '../events/post-created'\n")),(0,o.kt)("p",null,"We can do any validation in the command handler before storing the event, for our\nexample, we'll just save the received data in the ",(0,o.kt)("inlineCode",{parentName:"p"},"PostCreated")," event."),(0,o.kt)("h3",{id:"4-first-entity"},"4. First entity"),(0,o.kt)("p",null,"So far, our ",(0,o.kt)("inlineCode",{parentName:"p"},"PostCreated")," event suggests we need a ",(0,o.kt)("inlineCode",{parentName:"p"},"Post")," entity. Entities are a\nrepresentation of our system internal state. They are in charge of reducing (combining) all the events\nwith the same ",(0,o.kt)("inlineCode",{parentName:"p"},"entityID"),". Let's generate our ",(0,o.kt)("inlineCode",{parentName:"p"},"Post")," entity:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost new:entity Post --fields title:string content:string author:string --reduces PostCreated\n"))),(0,o.kt)("p",null,"You should see now a new file called ",(0,o.kt)("inlineCode",{parentName:"p"},"post.ts")," in the ",(0,o.kt)("inlineCode",{parentName:"p"},"src/entities")," directory."),(0,o.kt)("p",null,"This time, besides using the ",(0,o.kt)("inlineCode",{parentName:"p"},"--fields")," flag, we use the ",(0,o.kt)("inlineCode",{parentName:"p"},"--reduces")," flag to specify the events the entity will reduce and, this way, produce the Post current state. The generator will create one ",(0,o.kt)("em",{parentName:"p"},"reducer function")," for each event we have specified (only one in this case)."),(0,o.kt)("p",null,"Reducer functions in Booster work similarly to the ",(0,o.kt)("inlineCode",{parentName:"p"},"reduce")," callbacks in Javascript: they receive an event\nand the current state of the entity, and returns the next version of the same entity.\nIn this case, when we receive a ",(0,o.kt)("inlineCode",{parentName:"p"},"PostCreated")," event, we can just return a new ",(0,o.kt)("inlineCode",{parentName:"p"},"Post")," entity copying the fields\nfrom the event. 
There is no previous state of the Post as we are creating it for the first time:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"// src/entities/post.ts\n@Entity\nexport class Post {\n public constructor(public id: UUID, readonly title: string, readonly content: string, readonly author: string) {}\n\n @Reduces(PostCreated)\n public static reducePostCreated(event: PostCreated, currentPost?: Post): Post {\n return new Post(event.postId, event.title, event.content, event.author)\n }\n}\n")),(0,o.kt)("p",null,"Entities represent our domain model and can be queried from command or\nevent handlers to make business decisions or enforce business rules."),(0,o.kt)("h3",{id:"5-first-read-model"},"5. First read model"),(0,o.kt)("p",null,"In a real application, we rarely want to expose our entire domain model (entities)\nincluding all their fields. What is more, different users may have different views of the data depending\non their permissions or their use cases. That's the goal of ",(0,o.kt)("inlineCode",{parentName:"p"},"ReadModels"),". Client applications can query or\nsubscribe to them."),(0,o.kt)("p",null,"Read models are ",(0,o.kt)("em",{parentName:"p"},"projections")," of one or more entities into a new object that is reachable through the query and subscriptions APIs. Let's generate a ",(0,o.kt)("inlineCode",{parentName:"p"},"PostReadModel")," that projects our\n",(0,o.kt)("inlineCode",{parentName:"p"},"Post")," entity:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost new:read-model PostReadModel --fields title:string author:string --projects Post:id\n"))),(0,o.kt)("p",null,"We have used a new flag, ",(0,o.kt)("inlineCode",{parentName:"p"},"--projects"),", which allows us to specify the entities (there can be many) the read model will\nwatch for changes. You might be wondering what the ",(0,o.kt)("inlineCode",{parentName:"p"},":id")," after the entity name means. That's the ",(0,o.kt)("a",{parentName:"p",href:"/architecture/read-model#the-projection-function"},"joinKey"),",\nbut you can forget about it for now."),(0,o.kt)("p",null,"As you might guess, the read-model generator will create a file called\n",(0,o.kt)("inlineCode",{parentName:"p"},"post-read-model.ts")," under ",(0,o.kt)("inlineCode",{parentName:"p"},"src/read-models"),":"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-text"},"boosted-blog\n\u2514\u2500\u2500 src\n \u2514\u2500\u2500 read-models\n \u2514\u2500\u2500 post-read-model.ts\n")),(0,o.kt)("p",null,"There are two things to do when creating a read model:"),(0,o.kt)("ol",null,(0,o.kt)("li",{parentName:"ol"},"Define who is authorized to query or subscribe to it"),(0,o.kt)("li",{parentName:"ol"},"Add the logic of the projection functions, where you can filter, combine, etc., the entities' fields.")),(0,o.kt)("p",null,"While commands define the input to our system, read models define the output, and together they compose\nthe public API of a Booster application. Let's do the same as we did in the command and authorize ",(0,o.kt)("inlineCode",{parentName:"p"},"all")," to\nquery/subscribe to the ",(0,o.kt)("inlineCode",{parentName:"p"},"PostReadModel"),". 
Also, and for learning purposes, we will exclude the ",(0,o.kt)("inlineCode",{parentName:"p"},"content")," field\nfrom the ",(0,o.kt)("inlineCode",{parentName:"p"},"Post")," entity, so it won't be returned when users request the read model."),(0,o.kt)("p",null,"Edit the ",(0,o.kt)("inlineCode",{parentName:"p"},"post-read-model.ts")," file to look like this:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"// src/read-models/post-read-model.ts\n@ReadModel({\n authorize: 'all', // Specify authorized roles here. Use 'all' to authorize anyone\n})\nexport class PostReadModel {\n public constructor(public id: UUID, readonly title: string, readonly author: string) {}\n\n @Projects(Post, 'id')\n public static projectPost(entity: Post, currentPostReadModel?: PostReadModel): ProjectionResult {\n return new PostReadModel(entity.id, entity.title, entity.author)\n }\n}\n")),(0,o.kt)("h3",{id:"6-deployment"},"6. Deployment"),(0,o.kt)("p",null,"At this point, we've:"),(0,o.kt)("ul",null,(0,o.kt)("li",{parentName:"ul"},"Created a publicly accessible command"),(0,o.kt)("li",{parentName:"ul"},"Emitted an event as a mechanism to store data"),(0,o.kt)("li",{parentName:"ul"},"Reduced the event into an entity to have a representation of our internal state"),(0,o.kt)("li",{parentName:"ul"},"Projected the entity into a read model that is also publicly accessible.")),(0,o.kt)("p",null,"With this, you already know the basics to build event-driven, CQRS-based applications\nwith Booster."),(0,o.kt)("p",null,"You can check that code compiles correctly by running the build command:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost build\n"))),(0,o.kt)("p",null,"You can also clean the compiled code by running:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost clean\n"))),(0,o.kt)("h4",{id:"61-running-your-application-locally"},"6.1 Running your application locally"),(0,o.kt)("p",null,"Now, let's run our application to see it working. It is as simple as running:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost start -e local\n"))),(0,o.kt)("p",null,"This will execute a local ",(0,o.kt)("inlineCode",{parentName:"p"},"Express.js")," server and will try to expose it in port ",(0,o.kt)("inlineCode",{parentName:"p"},"3000"),". You can change the port by using the ",(0,o.kt)("inlineCode",{parentName:"p"},"-p")," option:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost start -e local -p 8080\n"))),(0,o.kt)("h4",{id:"62-deploying-to-the-cloud"},"6.2 Deploying to the cloud"),(0,o.kt)("p",null,"Also, we can deploy our application to the cloud with no additional changes by running\nthe deploy command:"),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"boost deploy -e production\n"))),(0,o.kt)("p",null,"This is the Booster magic! 
\u2728 When running the start or the deploy commands, Booster will handle the creation of all the resources, ",(0,o.kt)("em",{parentName:"p"},"like Lambdas, API Gateway,"),' and the "glue" between them; ',(0,o.kt)("em",{parentName:"p"},"permissions, events, triggers, etc.")," It even creates a fully functional GraphQL API!"),(0,o.kt)("admonition",{type:"note"},(0,o.kt)("p",{parentName:"admonition"},"The deploy command automatically builds the project for you before performing updates in the cloud provider, so running the build command beforehand is not required.")),(0,o.kt)("blockquote",null,(0,o.kt)("p",{parentName:"blockquote"},"With ",(0,o.kt)("inlineCode",{parentName:"p"},"-e production")," we are specifying which environment we want to deploy to. We'll talk about environments later.")),(0,o.kt)("admonition",{type:"tip"},(0,o.kt)("p",{parentName:"admonition"},"If at this point you still don\u2019t believe everything is done, feel free to check in your provider\u2019s console. You should see, as in the AWS example below, that the stack and all the services are up and running! It will be the same for other providers. \ud83d\ude80")),(0,o.kt)("p",null,(0,o.kt)("img",{alt:"resources",src:n(2822).Z,width:"2726",height:"1276"})),(0,o.kt)("p",null,"When deploying, it will take a couple of minutes to deploy all the resources. Once finished, you will see\ninformation about your application endpoints and other outputs. For this example, we will\nonly need to pick the output ending in ",(0,o.kt)("inlineCode",{parentName:"p"},"httpURL"),", e.g.:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-text"},"https://.execute-api.us-east-1.amazonaws.com/production\n")),(0,o.kt)("admonition",{type:"note"},(0,o.kt)("p",{parentName:"admonition"},"By default, the full error stack trace is sent to a local file, ",(0,o.kt)("inlineCode",{parentName:"p"},"./errors.log"),". To see the full error stack trace directly from the console, use the ",(0,o.kt)("inlineCode",{parentName:"p"},"--verbose")," flag.")),(0,o.kt)("h3",{id:"7-testing"},"7. Testing"),(0,o.kt)("p",null,"Let's get started testing the project. We will perform three actions:"),(0,o.kt)("ul",null,(0,o.kt)("li",{parentName:"ul"},"Add a couple of posts"),(0,o.kt)("li",{parentName:"ul"},"Retrieve all posts"),(0,o.kt)("li",{parentName:"ul"},"Retrieve a specific post")),(0,o.kt)("p",null,"Booster applications provide you with a GraphQL API out of the box. You send commands using\n",(0,o.kt)("em",{parentName:"p"},"mutations")," and get read model data using ",(0,o.kt)("em",{parentName:"p"},"queries")," or ",(0,o.kt)("em",{parentName:"p"},"subscriptions"),"."),(0,o.kt)("p",null,"In this section, we will be sending requests by hand using the free ",(0,o.kt)("a",{parentName:"p",href:"https://altair.sirmuel.design/"},"Altair")," GraphQL client,\nwhich is very simple and straightforward for this guide. However, you can use any client you want. 
Your endpoint URL should look like this:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-text"},"/graphql\n")),(0,o.kt)("h4",{id:"71-creating-posts"},"7.1 Creating posts"),(0,o.kt)("p",null,"Let's use two mutations to send two ",(0,o.kt)("inlineCode",{parentName:"p"},"CreatePost")," commands."),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-graphql"},'mutation {\n CreatePost(\n input: {\n postId: "95ddb544-4a60-439f-a0e4-c57e806f2f6e"\n title: "Build a blog in 10 minutes with Booster"\n content: "I am so excited to write my first post"\n author: "Boosted developer"\n }\n )\n}\n')),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-graphql"},'mutation {\n CreatePost(\n input: {\n postId: "05670e55-fd31-490e-b585-3a0096db0412"\n title: "Booster framework rocks"\n content: "I am so excited for writing the second post"\n author: "Another boosted developer"\n }\n )\n}\n')),(0,o.kt)("p",null,"The expected response for each of those requests should be:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "CreatePost": true\n }\n}\n')),(0,o.kt)("admonition",{type:"note"},(0,o.kt)("p",{parentName:"admonition"},"In this example, the IDs are generated on the client-side. When running production applications, consider adding validation for ID uniqueness. For this example, we have used ",(0,o.kt)("a",{parentName:"p",href:"https://www.uuidgenerator.net/version4"},"a UUID generator"),".")),(0,o.kt)("h4",{id:"72-retrieving-all-posts"},"7.2 Retrieving all posts"),(0,o.kt)("p",null,"Let's perform a GraphQL ",(0,o.kt)("inlineCode",{parentName:"p"},"query")," that will hit our ",(0,o.kt)("inlineCode",{parentName:"p"},"PostReadModel"),":"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-graphql"},"query {\n PostReadModels {\n id\n title\n author\n }\n}\n")),(0,o.kt)("p",null,"It should respond with something like:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "PostReadModels": [\n {\n "id": "05670e55-fd31-490e-b585-3a0096db0412",\n "title": "Booster framework rocks",\n "author": "Another boosted developer"\n },\n {\n "id": "95ddb544-4a60-439f-a0e4-c57e806f2f6e",\n "title": "Build a blog in 10 minutes with Booster",\n "author": "Boosted developer"\n }\n ]\n }\n}\n')),(0,o.kt)("h4",{id:"73-retrieving-specific-post"},"7.3 Retrieving a specific post"),(0,o.kt)("p",null,"It is also possible to retrieve a specific ",(0,o.kt)("inlineCode",{parentName:"p"},"Post")," by adding the ",(0,o.kt)("inlineCode",{parentName:"p"},"id")," as input, e.g.:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-graphql"},'query {\n PostReadModel(id: "95ddb544-4a60-439f-a0e4-c57e806f2f6e") {\n id\n title\n author\n }\n}\n')),(0,o.kt)("p",null,"You should get a response similar to this:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-json"},'{\n "data": {\n "PostReadModel": {\n "id": "95ddb544-4a60-439f-a0e4-c57e806f2f6e",\n "title": "Build a blog in 10 minutes with Booster",\n "author": "Boosted developer"\n }\n }\n}\n')),(0,o.kt)("h3",{id:"8-removing-the-stack"},"8. Removing the stack"),(0,o.kt)("p",null,"It is convenient to destroy all the created infrastructure once you stop using\nit, to avoid incurring cloud resource costs. Execute the following command from\nthe root of the project. 
For safety reasons, you have to confirm this action by\ntyping the project's name, in our case ",(0,o.kt)("inlineCode",{parentName:"p"},"boosted-blog"),", which is the same name we used when\nrunning the ",(0,o.kt)("inlineCode",{parentName:"p"},"new:project")," CLI command."),(0,o.kt)(r.Z,{mdxType:"TerminalWindow"},(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"> boost nuke -e production\n\n? Please, enter the app name to confirm deletion of all resources: boosted-blog\n"))),(0,o.kt)("blockquote",null,(0,o.kt)("p",{parentName:"blockquote"},"Congratulations! You've built a serverless backend in less than 10 minutes. We hope you have enjoyed discovering the magic of the Booster Framework.")),(0,o.kt)("h3",{id:"9-more-functionalities"},"9. More functionalities"),(0,o.kt)("p",null,"This is a really basic example of a Booster application. There are many other features Booster provides, like:"),(0,o.kt)("ul",null,(0,o.kt)("li",{parentName:"ul"},"Use a more complex authorization schema for commands and read models based on user roles"),(0,o.kt)("li",{parentName:"ul"},"Use GraphQL subscriptions to get updates in real-time"),(0,o.kt)("li",{parentName:"ul"},"Make events trigger other events"),(0,o.kt)("li",{parentName:"ul"},"Deploy static content"),(0,o.kt)("li",{parentName:"ul"},"Read entities within command handlers to apply domain-driven decisions"),(0,o.kt)("li",{parentName:"ul"},"And much more...")),(0,o.kt)("p",null,"Continue reading to dig deeper. You've just scratched the surface of all the Booster\ncapabilities!"),(0,o.kt)("h2",{id:"examples-and-walkthroughs"},"Examples and walkthroughs"),(0,o.kt)("h3",{id:"creation-of-a-question-asking-application-backend"},"Creation of a question-asking application backend"),(0,o.kt)("p",null,"In the following video, you will see how to create a backend for a question-asking application from scratch. This application allows\nusers to create questions and like them. 
This video goes from creating the project to incrementally deploying features in the application.\nYou can find the code both for the frontend and the backend in ",(0,o.kt)(i.do,{mdxType:"CLAskMeRepo"},(0,o.kt)("a",{parentName:"p",href:"https://github.com/boostercloud/examples/tree/master/askme"},"this GitHub repo")),"."),(0,o.kt)("div",{align:"center"},(0,o.kt)("iframe",{width:"560",height:"315",src:"https://www.youtube.com/embed/C4K2M-orT8k",title:"YouTube video player",frameBorder:"0",allow:"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture",allowFullScreen:!0})),(0,o.kt)("h3",{id:"all-the-guides-and-examples"},"All the guides and examples"),(0,o.kt)("p",null,"Check out the ",(0,o.kt)(i.dM,{mdxType:"CLExampleApps"},(0,o.kt)("a",{parentName:"p",href:"https://github.com/boostercloud/examples"},"example apps repository"))," to see Booster in use."))}m.isMDXComponent=!0},2822:(e,t,n)=>{n.d(t,{Z:()=>a});const a=n.p+"assets/images/aws-resources-e620ed48140a022aae2ca68d0c52b496.png"}}]); \ No newline at end of file diff --git a/assets/js/46b77955.56cc2c75.js b/assets/js/46b77955.a2834cba.js similarity index 85% rename from assets/js/46b77955.56cc2c75.js rename to assets/js/46b77955.a2834cba.js index e78eee210..52aac50f1 100644 --- a/assets/js/46b77955.56cc2c75.js +++ b/assets/js/46b77955.a2834cba.js @@ -1 +1 @@ -"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[9284],{3905:(e,t,r)=>{r.d(t,{Zo:()=>u,kt:()=>d});var n=r(7294);function o(e,t,r){return t in e?Object.defineProperty(e,t,{value:r,enumerable:!0,configurable:!0,writable:!0}):e[t]=r,e}function a(e,t){var r=Object.keys(e);if(Object.getOwnPropertySymbols){var n=Object.getOwnPropertySymbols(e);t&&(n=n.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),r.push.apply(r,n)}return r}function i(e){for(var t=1;t=0||(o[r]=e[r]);return o}(e,t);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);for(n=0;n=0||Object.prototype.propertyIsEnumerable.call(e,r)&&(o[r]=e[r])}return o}var c=n.createContext({}),l=function(e){var t=n.useContext(c),r=t;return e&&(r="function"==typeof e?e(t):i(i({},t),e)),r},u=function(e){var t=l(e.components);return n.createElement(c.Provider,{value:t},e.children)},p={inlineCode:"code",wrapper:function(e){var t=e.children;return n.createElement(n.Fragment,{},t)}},f=n.forwardRef((function(e,t){var r=e.components,o=e.mdxType,a=e.originalType,c=e.parentName,u=s(e,["components","mdxType","originalType","parentName"]),f=l(r),d=o,m=f["".concat(c,".").concat(d)]||f[d]||p[d]||a;return r?n.createElement(m,i(i({ref:t},u),{},{components:r})):n.createElement(m,i({ref:t},u))}));function d(e,t){var r=arguments,o=t&&t.mdxType;if("string"==typeof e||o){var a=r.length,i=new Array(a);i[0]=f;var s={};for(var c in t)hasOwnProperty.call(t,c)&&(s[c]=t[c]);s.originalType=e,s.mdxType="string"==typeof e?e:o,i[1]=s;for(var l=2;l{r.r(t),r.d(t,{assets:()=>l,contentTitle:()=>s,default:()=>f,frontMatter:()=>i,metadata:()=>c,toc:()=>u});var n=r(7462),o=(r(7294),r(3905)),a=r(999);const i={slug:"/"},s="Ask about Booster Framework",c={unversionedId:"ai-assistant",id:"ai-assistant",title:"Ask about Booster Framework",description:"",source:"@site/docs/00_ai-assistant.md",sourceDirName:".",slug:"/",permalink:"/",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/00_ai-assistant.md",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1696768385,formattedLastUpdatedAt:"Oct 8, 
2023",sidebarPosition:0,frontMatter:{slug:"/"},sidebar:"docs",next:{title:"Introduction",permalink:"/introduction"}},l={},u=[],p={toc:u};function f(e){let{components:t,...r}=e;return(0,o.kt)("wrapper",(0,n.Z)({},p,r,{components:t,mdxType:"MDXLayout"}),(0,o.kt)("h1",{id:"ask-about-booster-framework"},"Ask about Booster Framework"),(0,o.kt)(a.ZP,{mdxType:"BoosterChat"}))}f.isMDXComponent=!0}}]); \ No newline at end of file +"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[9284],{3905:(e,t,r)=>{r.d(t,{Zo:()=>u,kt:()=>d});var n=r(7294);function o(e,t,r){return t in e?Object.defineProperty(e,t,{value:r,enumerable:!0,configurable:!0,writable:!0}):e[t]=r,e}function a(e,t){var r=Object.keys(e);if(Object.getOwnPropertySymbols){var n=Object.getOwnPropertySymbols(e);t&&(n=n.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),r.push.apply(r,n)}return r}function i(e){for(var t=1;t=0||(o[r]=e[r]);return o}(e,t);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);for(n=0;n=0||Object.prototype.propertyIsEnumerable.call(e,r)&&(o[r]=e[r])}return o}var c=n.createContext({}),l=function(e){var t=n.useContext(c),r=t;return e&&(r="function"==typeof e?e(t):i(i({},t),e)),r},u=function(e){var t=l(e.components);return n.createElement(c.Provider,{value:t},e.children)},p={inlineCode:"code",wrapper:function(e){var t=e.children;return n.createElement(n.Fragment,{},t)}},f=n.forwardRef((function(e,t){var r=e.components,o=e.mdxType,a=e.originalType,c=e.parentName,u=s(e,["components","mdxType","originalType","parentName"]),f=l(r),d=o,m=f["".concat(c,".").concat(d)]||f[d]||p[d]||a;return r?n.createElement(m,i(i({ref:t},u),{},{components:r})):n.createElement(m,i({ref:t},u))}));function d(e,t){var r=arguments,o=t&&t.mdxType;if("string"==typeof e||o){var a=r.length,i=new Array(a);i[0]=f;var s={};for(var c in t)hasOwnProperty.call(t,c)&&(s[c]=t[c]);s.originalType=e,s.mdxType="string"==typeof e?e:o,i[1]=s;for(var l=2;l{r.r(t),r.d(t,{assets:()=>l,contentTitle:()=>s,default:()=>f,frontMatter:()=>i,metadata:()=>c,toc:()=>u});var n=r(7462),o=(r(7294),r(3905)),a=r(999);const i={slug:"/"},s="Ask about Booster Framework",c={unversionedId:"ai-assistant",id:"ai-assistant",title:"Ask about Booster Framework",description:"",source:"@site/docs/00_ai-assistant.md",sourceDirName:".",slug:"/",permalink:"/",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/00_ai-assistant.md",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1697652123,formattedLastUpdatedAt:"Oct 18, 2023",sidebarPosition:0,frontMatter:{slug:"/"},sidebar:"docs",next:{title:"Introduction",permalink:"/introduction"}},l={},u=[],p={toc:u};function f(e){let{components:t,...r}=e;return(0,o.kt)("wrapper",(0,n.Z)({},p,r,{components:t,mdxType:"MDXLayout"}),(0,o.kt)("h1",{id:"ask-about-booster-framework"},"Ask about Booster Framework"),(0,o.kt)(a.ZP,{mdxType:"BoosterChat"}))}f.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/4da0bd64.d7e9cb9e.js b/assets/js/4da0bd64.d7e9cb9e.js deleted file mode 100644 index dca0f7578..000000000 --- a/assets/js/4da0bd64.d7e9cb9e.js +++ /dev/null @@ -1 +0,0 @@ -"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[6038],{3905:(e,t,n)=>{n.d(t,{Zo:()=>p,kt:()=>m});var r=n(7294);function o(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function a(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var 
r=Object.getOwnPropertySymbols(e);t&&(r=r.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,r)}return n}function i(e){for(var t=1;t=0||(o[n]=e[n]);return o}(e,t);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);for(r=0;r=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(o[n]=e[n])}return o}var s=r.createContext({}),l=function(e){var t=r.useContext(s),n=t;return e&&(n="function"==typeof e?e(t):i(i({},t),e)),n},p=function(e){var t=l(e.components);return r.createElement(s.Provider,{value:t},e.children)},d={inlineCode:"code",wrapper:function(e){var t=e.children;return r.createElement(r.Fragment,{},t)}},u=r.forwardRef((function(e,t){var n=e.components,o=e.mdxType,a=e.originalType,s=e.parentName,p=c(e,["components","mdxType","originalType","parentName"]),u=l(n),m=o,f=u["".concat(s,".").concat(m)]||u[m]||d[m]||a;return n?r.createElement(f,i(i({ref:t},p),{},{components:n})):r.createElement(f,i({ref:t},p))}));function m(e,t){var n=arguments,o=t&&t.mdxType;if("string"==typeof e||o){var a=n.length,i=new Array(a);i[0]=u;var c={};for(var s in t)hasOwnProperty.call(t,s)&&(c[s]=t[s]);c.originalType=e,c.mdxType="string"==typeof e?e:o,i[1]=c;for(var l=2;l{n.r(t),n.d(t,{assets:()=>s,contentTitle:()=>i,default:()=>d,frontMatter:()=>a,metadata:()=>c,toc:()=>l});var r=n(7462),o=(n(7294),n(3905));const a={},i="Booster instrumentation",c={unversionedId:"going-deeper/instrumentation",id:"going-deeper/instrumentation",title:"Booster instrumentation",description:"Trace Decorator",source:"@site/docs/10_going-deeper/instrumentation.md",sourceDirName:"10_going-deeper",slug:"/going-deeper/instrumentation",permalink:"/going-deeper/instrumentation",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/10_going-deeper/instrumentation.md",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1696768385,formattedLastUpdatedAt:"Oct 8, 2023",frontMatter:{},sidebar:"docs",previous:{title:"Framework packages",permalink:"/going-deeper/framework-packages"},next:{title:"Frequently Asked Questions",permalink:"/frequently-asked-questions"}},s={},l=[{value:"Trace Decorator",id:"trace-decorator",level:2},{value:"Usage",id:"usage",level:3},{value:"TraceActionTypes",id:"traceactiontypes",level:3},{value:"TraceInfo",id:"traceinfo",level:3},{value:"Adding the Trace Decorator to Your own async methods",id:"adding-the-trace-decorator-to-your-own-async-methods",level:3}],p={toc:l};function d(e){let{components:t,...n}=e;return(0,o.kt)("wrapper",(0,r.Z)({},p,n,{components:t,mdxType:"MDXLayout"}),(0,o.kt)("h1",{id:"booster-instrumentation"},"Booster instrumentation"),(0,o.kt)("h2",{id:"trace-decorator"},"Trace Decorator"),(0,o.kt)("p",null,"The Trace Decorator is a ",(0,o.kt)("strong",{parentName:"p"},"Booster")," functionality that facilitates the reception of notifications whenever significant events occur in Booster's core, such as event dispatching or migration execution."),(0,o.kt)("h3",{id:"usage"},"Usage"),(0,o.kt)("p",null,"To configure a custom tracer, you need to define an object with two methods: onStart and onEnd. The onStart method is called before the traced method is invoked, and the onEnd method is called after the method completes. 
Both methods receive a TraceInfo object, which contains information about the traced method and its arguments."),(0,o.kt)("p",null,"Here's an example of a custom tracer that logs trace events to the console:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"import {\n TraceParameters,\n BoosterConfig,\n TraceActionTypes,\n} from '@boostercloud/framework-types'\n\nclass MyTracer {\n static async onStart(config: BoosterConfig, actionType: string, traceParameters: TraceParameters): Promise {\n console.log(`Start ${actionType}: ${traceParameters.className}.${traceParameters.methodName}`)\n }\n\n static async onEnd(config: BoosterConfig, actionType: string, traceParameters: TraceParameters): Promise {\n console.log(`End ${actionType}: ${traceParameters.className}.${traceParameters.methodName}`)\n }\n}\n")),(0,o.kt)("p",null,"You can then configure the tracer in your Booster application's configuration:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"import { BoosterConfig } from '@boostercloud/framework-types'\nimport { MyTracer } from './my-tracer'\n\nconst config: BoosterConfig = {\n// ...other configuration options...\n trace: {\n enableTraceNotification: true,\n onStart: MyTracer.onStart,\n onEnd: MyTracer.onStart,\n }\n}\n")),(0,o.kt)("p",null,"In the configuration above, we've enabled trace notifications and specified our onStart and onEnd as the methods to use. Verbose disable will reduce the amount of information generated excluding the internal parameter in the trace parameters. "),(0,o.kt)("p",null,"Setting ",(0,o.kt)("inlineCode",{parentName:"p"},"enableTraceNotification: true")," would enable the trace for all actions. You can either disable them by setting it to ",(0,o.kt)("inlineCode",{parentName:"p"},"false")," or selectively enable only specific actions using an array of TraceActionTypes."),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"import { BoosterConfig, TraceActionTypes } from '@boostercloud/framework-types'\nimport { MyTracer } from './my-tracer'\n\nconst config: BoosterConfig = {\n// ...other configuration options...\n trace: {\n enableTraceNotification: [TraceActionTypes.DISPATCH_EVENT, TraceActionTypes.MIGRATION_RUN, 'OTHER'],\n includeInternal: false,\n onStart: MyTracer.onStart,\n onEnd: MyTracer.onStart,\n }\n}\n")),(0,o.kt)("p",null,"In this example, only DISPATCH_EVENT, MIGRATION_RUN and 'OTHER' actions will trigger trace notifications."),(0,o.kt)("h3",{id:"traceactiontypes"},"TraceActionTypes"),(0,o.kt)("p",null,"The TraceActionTypes enum defines all the traceable actions in Booster's core:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"export enum TraceActionTypes {\n CUSTOM,\n EVENT_HANDLERS_PROCESS,\n HANDLE_EVENT,\n DISPATCH_ENTITY_TO_EVENT_HANDLERS,\n DISPATCH_EVENTS,\n FETCH_ENTITY_SNAPSHOT,\n STORE_SNAPSHOT,\n LOAD_LATEST_SNAPSHOT,\n LOAD_EVENT_STREAM_SINCE,\n ENTITY_REDUCER,\n READ_MODEL_FIND_BY_ID,\n GRAPHQL_READ_MODEL_SEARCH,\n READ_MODEL_SEARCH,\n COMMAND_HANDLER,\n MIGRATION_RUN,\n GRAPHQL_DISPATCH,\n GRAPHQL_RUN_OPERATION,\n SCHEDULED_COMMAND_HANDLER,\n DISPATCH_SUBSCRIBER_NOTIFIER,\n READ_MODEL_SCHEMA_MIGRATOR_RUN,\n SCHEMA_MIGRATOR_MIGRATE,\n}\n")),(0,o.kt)("h3",{id:"traceinfo"},"TraceInfo"),(0,o.kt)("p",null,"The TraceInfo interface defines the data that is passed to the tracer's onBefore and onAfter 
methods:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"export interface TraceInfo {\n className: string\n methodName: string\n args: Array\n traceId: UUID\n elapsedInvocationMillis?: number\n internal: {\n target: unknown\n descriptor: PropertyDescriptor\n }\n description?: string\n}\n")),(0,o.kt)("p",null,(0,o.kt)("inlineCode",{parentName:"p"},"className")," and ",(0,o.kt)("inlineCode",{parentName:"p"},"methodName")," identify the function that is being traced."),(0,o.kt)("h3",{id:"adding-the-trace-decorator-to-your-own-async-methods"},"Adding the Trace Decorator to Your own async methods"),(0,o.kt)("p",null,"In addition to using the Trace Decorator to receive notifications when events occur in Booster's core, you can also use it to trace your own methods. To add the Trace Decorator to your own methods, simply add @Trace() before your method declaration."),(0,o.kt)("p",null,"Here's an example of how to use the Trace Decorator on a custom method:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"import { Trace } from '@boostercloud/framework-core'\nimport { BoosterConfig, Logger } from '@boostercloud/framework-types'\n\nexport class MyCustomClass {\n @Trace('OTHER')\n public async myCustomMethod(config: BoosterConfig, logger: Logger): Promise {\n logger.debug('This is my custom method')\n // Do some custom logic here...\n }\n}\n")),(0,o.kt)("p",null,"In the example above, we added the @Trace('OTHER') decorator to the myCustomMethod method. This will cause the method to emit trace events when it's invoked, allowing you to trace the flow of your application and detect performance bottlenecks or errors."),(0,o.kt)("p",null,"Note that when you add the Trace Decorator to your own methods, you'll need to configure your Booster instance to use a tracer that implements the necessary methods to handle these events."))}d.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/4da0bd64.fe966a5a.js b/assets/js/4da0bd64.fe966a5a.js new file mode 100644 index 000000000..3ca30a4d7 --- /dev/null +++ b/assets/js/4da0bd64.fe966a5a.js @@ -0,0 +1 @@ +"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[6038],{3905:(e,t,n)=>{n.d(t,{Zo:()=>p,kt:()=>m});var r=n(7294);function o(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function a(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var r=Object.getOwnPropertySymbols(e);t&&(r=r.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,r)}return n}function i(e){for(var t=1;t=0||(o[n]=e[n]);return o}(e,t);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);for(r=0;r=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(o[n]=e[n])}return o}var s=r.createContext({}),l=function(e){var t=r.useContext(s),n=t;return e&&(n="function"==typeof e?e(t):i(i({},t),e)),n},p=function(e){var t=l(e.components);return r.createElement(s.Provider,{value:t},e.children)},d={inlineCode:"code",wrapper:function(e){var t=e.children;return r.createElement(r.Fragment,{},t)}},u=r.forwardRef((function(e,t){var n=e.components,o=e.mdxType,a=e.originalType,s=e.parentName,p=c(e,["components","mdxType","originalType","parentName"]),u=l(n),m=o,f=u["".concat(s,".").concat(m)]||u[m]||d[m]||a;return n?r.createElement(f,i(i({ref:t},p),{},{components:n})):r.createElement(f,i({ref:t},p))}));function m(e,t){var n=arguments,o=t&&t.mdxType;if("string"==typeof 
e||o){var a=n.length,i=new Array(a);i[0]=u;var c={};for(var s in t)hasOwnProperty.call(t,s)&&(c[s]=t[s]);c.originalType=e,c.mdxType="string"==typeof e?e:o,i[1]=c;for(var l=2;l{n.r(t),n.d(t,{assets:()=>s,contentTitle:()=>i,default:()=>d,frontMatter:()=>a,metadata:()=>c,toc:()=>l});var r=n(7462),o=(n(7294),n(3905));const a={},i="Booster instrumentation",c={unversionedId:"going-deeper/instrumentation",id:"going-deeper/instrumentation",title:"Booster instrumentation",description:"Trace Decorator",source:"@site/docs/10_going-deeper/instrumentation.md",sourceDirName:"10_going-deeper",slug:"/going-deeper/instrumentation",permalink:"/going-deeper/instrumentation",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/10_going-deeper/instrumentation.md",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1697652123,formattedLastUpdatedAt:"Oct 18, 2023",frontMatter:{},sidebar:"docs",previous:{title:"Framework packages",permalink:"/going-deeper/framework-packages"},next:{title:"Frequently Asked Questions",permalink:"/frequently-asked-questions"}},s={},l=[{value:"Trace Decorator",id:"trace-decorator",level:2},{value:"Usage",id:"usage",level:3},{value:"TraceActionTypes",id:"traceactiontypes",level:3},{value:"TraceInfo",id:"traceinfo",level:3},{value:"Adding the Trace Decorator to Your own async methods",id:"adding-the-trace-decorator-to-your-own-async-methods",level:3}],p={toc:l};function d(e){let{components:t,...n}=e;return(0,o.kt)("wrapper",(0,r.Z)({},p,n,{components:t,mdxType:"MDXLayout"}),(0,o.kt)("h1",{id:"booster-instrumentation"},"Booster instrumentation"),(0,o.kt)("h2",{id:"trace-decorator"},"Trace Decorator"),(0,o.kt)("p",null,"The Trace Decorator is a ",(0,o.kt)("strong",{parentName:"p"},"Booster")," functionality that facilitates the reception of notifications whenever significant events occur in Booster's core, such as event dispatching or migration execution."),(0,o.kt)("h3",{id:"usage"},"Usage"),(0,o.kt)("p",null,"To configure a custom tracer, you need to define an object with two methods: onStart and onEnd. The onStart method is called before the traced method is invoked, and the onEnd method is called after the method completes. 
Both methods receive a TraceInfo object, which contains information about the traced method and its arguments."),(0,o.kt)("p",null,"Here's an example of a custom tracer that logs trace events to the console:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"import {\n TraceParameters,\n BoosterConfig,\n TraceActionTypes,\n} from '@boostercloud/framework-types'\n\nclass MyTracer {\n static async onStart(config: BoosterConfig, actionType: string, traceParameters: TraceParameters): Promise {\n console.log(`Start ${actionType}: ${traceParameters.className}.${traceParameters.methodName}`)\n }\n\n static async onEnd(config: BoosterConfig, actionType: string, traceParameters: TraceParameters): Promise {\n console.log(`End ${actionType}: ${traceParameters.className}.${traceParameters.methodName}`)\n }\n}\n")),(0,o.kt)("p",null,"You can then configure the tracer in your Booster application's configuration:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"import { BoosterConfig } from '@boostercloud/framework-types'\nimport { MyTracer } from './my-tracer'\n\nconst config: BoosterConfig = {\n// ...other configuration options...\n trace: {\n enableTraceNotification: true,\n onStart: MyTracer.onStart,\n onEnd: MyTracer.onStart,\n }\n}\n")),(0,o.kt)("p",null,"In the configuration above, we've enabled trace notifications and specified our onStart and onEnd as the methods to use. Verbose disable will reduce the amount of information generated excluding the internal parameter in the trace parameters. "),(0,o.kt)("p",null,"Setting ",(0,o.kt)("inlineCode",{parentName:"p"},"enableTraceNotification: true")," would enable the trace for all actions. You can either disable them by setting it to ",(0,o.kt)("inlineCode",{parentName:"p"},"false")," or selectively enable only specific actions using an array of TraceActionTypes."),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"import { BoosterConfig, TraceActionTypes } from '@boostercloud/framework-types'\nimport { MyTracer } from './my-tracer'\n\nconst config: BoosterConfig = {\n// ...other configuration options...\n trace: {\n enableTraceNotification: [TraceActionTypes.DISPATCH_EVENT, TraceActionTypes.MIGRATION_RUN, 'OTHER'],\n includeInternal: false,\n onStart: MyTracer.onStart,\n onEnd: MyTracer.onStart,\n }\n}\n")),(0,o.kt)("p",null,"In this example, only DISPATCH_EVENT, MIGRATION_RUN and 'OTHER' actions will trigger trace notifications."),(0,o.kt)("h3",{id:"traceactiontypes"},"TraceActionTypes"),(0,o.kt)("p",null,"The TraceActionTypes enum defines all the traceable actions in Booster's core:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"export enum TraceActionTypes {\n CUSTOM,\n EVENT_HANDLERS_PROCESS,\n HANDLE_EVENT,\n DISPATCH_ENTITY_TO_EVENT_HANDLERS,\n DISPATCH_EVENTS,\n FETCH_ENTITY_SNAPSHOT,\n STORE_SNAPSHOT,\n LOAD_LATEST_SNAPSHOT,\n LOAD_EVENT_STREAM_SINCE,\n ENTITY_REDUCER,\n READ_MODEL_FIND_BY_ID,\n GRAPHQL_READ_MODEL_SEARCH,\n READ_MODEL_SEARCH,\n COMMAND_HANDLER,\n MIGRATION_RUN,\n GRAPHQL_DISPATCH,\n GRAPHQL_RUN_OPERATION,\n SCHEDULED_COMMAND_HANDLER,\n DISPATCH_SUBSCRIBER_NOTIFIER,\n READ_MODEL_SCHEMA_MIGRATOR_RUN,\n SCHEMA_MIGRATOR_MIGRATE,\n}\n")),(0,o.kt)("h3",{id:"traceinfo"},"TraceInfo"),(0,o.kt)("p",null,"The TraceInfo interface defines the data that is passed to the tracer's onBefore and onAfter 
methods:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"export interface TraceInfo {\n className: string\n methodName: string\n args: Array\n traceId: UUID\n elapsedInvocationMillis?: number\n internal: {\n target: unknown\n descriptor: PropertyDescriptor\n }\n description?: string\n}\n")),(0,o.kt)("p",null,(0,o.kt)("inlineCode",{parentName:"p"},"className")," and ",(0,o.kt)("inlineCode",{parentName:"p"},"methodName")," identify the function that is being traced."),(0,o.kt)("h3",{id:"adding-the-trace-decorator-to-your-own-async-methods"},"Adding the Trace Decorator to Your own async methods"),(0,o.kt)("p",null,"In addition to using the Trace Decorator to receive notifications when events occur in Booster's core, you can also use it to trace your own methods. To add the Trace Decorator to your own methods, simply add @Trace() before your method declaration."),(0,o.kt)("p",null,"Here's an example of how to use the Trace Decorator on a custom method:"),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-typescript"},"import { Trace } from '@boostercloud/framework-core'\nimport { BoosterConfig, Logger } from '@boostercloud/framework-types'\n\nexport class MyCustomClass {\n @Trace('OTHER')\n public async myCustomMethod(config: BoosterConfig, logger: Logger): Promise {\n logger.debug('This is my custom method')\n // Do some custom logic here...\n }\n}\n")),(0,o.kt)("p",null,"In the example above, we added the @Trace('OTHER') decorator to the myCustomMethod method. This will cause the method to emit trace events when it's invoked, allowing you to trace the flow of your application and detect performance bottlenecks or errors."),(0,o.kt)("p",null,"Note that when you add the Trace Decorator to your own methods, you'll need to configure your Booster instance to use a tracer that implements the necessary methods to handle these events."))}d.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/55aa456f.7acb874a.js b/assets/js/55aa456f.7acb874a.js deleted file mode 100644 index 5415a3b2f..000000000 --- a/assets/js/55aa456f.7acb874a.js +++ /dev/null @@ -1 +0,0 @@ -"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[695],{3905:(e,t,n)=>{n.d(t,{Zo:()=>c,kt:()=>u});var a=n(7294);function r(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function i(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);t&&(a=a.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,a)}return n}function l(e){for(var t=1;t=0||(r[n]=e[n]);return r}(e,t);if(Object.getOwnPropertySymbols){var i=Object.getOwnPropertySymbols(e);for(a=0;a=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(r[n]=e[n])}return r}var s=a.createContext({}),d=function(e){var t=a.useContext(s),n=t;return e&&(n="function"==typeof e?e(t):l(l({},t),e)),n},c=function(e){var t=d(e.components);return a.createElement(s.Provider,{value:t},e.children)},h={inlineCode:"code",wrapper:function(e){var t=e.children;return a.createElement(a.Fragment,{},t)}},p=a.forwardRef((function(e,t){var n=e.components,r=e.mdxType,i=e.originalType,s=e.parentName,c=o(e,["components","mdxType","originalType","parentName"]),p=d(n),u=r,v=p["".concat(s,".").concat(u)]||p[u]||h[u]||i;return n?a.createElement(v,l(l({ref:t},c),{},{components:n})):a.createElement(v,l({ref:t},c))}));function u(e,t){var n=arguments,r=t&&t.mdxType;if("string"==typeof 
e||r){var i=n.length,l=new Array(i);l[0]=p;var o={};for(var s in t)hasOwnProperty.call(t,s)&&(o[s]=t[s]);o.originalType=e,o.mdxType="string"==typeof e?e:r,l[1]=o;for(var d=2;d{n.d(t,{Z:()=>d});var a=n(7294);const r="terminalWindow_wGrl",i="terminalWindowHeader_o9Cs",l="buttons_IGLB",o="dot_fGZE",s="terminalWindowBody_tzdS";function d(e){let{children:t}=e;return a.createElement("div",{className:r},a.createElement("div",{className:i},a.createElement("div",{className:l},a.createElement("span",{className:o,style:{background:"#f25f58"}}),a.createElement("span",{className:o,style:{background:"#fbbe3c"}}),a.createElement("span",{className:o,style:{background:"#58cb42"}}))),a.createElement("div",{className:s},t))}},5268:(e,t,n)=>{n.r(t),n.d(t,{assets:()=>d,contentTitle:()=>o,default:()=>p,frontMatter:()=>l,metadata:()=>s,toc:()=>c});var a=n(7462),r=(n(7294),n(3905)),i=n(5163);const l={description:"Learn how to react to events and trigger side effects in Booster by defining event handlers."},o="Event handler",s={unversionedId:"architecture/event-handler",id:"architecture/event-handler",title:"Event handler",description:"Learn how to react to events and trigger side effects in Booster by defining event handlers.",source:"@site/docs/03_architecture/04_event-handler.mdx",sourceDirName:"03_architecture",slug:"/architecture/event-handler",permalink:"/architecture/event-handler",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/03_architecture/04_event-handler.mdx",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1696768385,formattedLastUpdatedAt:"Oct 8, 2023",sidebarPosition:4,frontMatter:{description:"Learn how to react to events and trigger side effects in Booster by defining event handlers."},sidebar:"docs",previous:{title:"Event",permalink:"/architecture/event"},next:{title:"Entity",permalink:"/architecture/entity"}},d={},c=[{value:"Creating an event handler",id:"creating-an-event-handler",level:2},{value:"Declaring an event handler",id:"declaring-an-event-handler",level:2},{value:"Creating an event handler",id:"creating-an-event-handler-1",level:2},{value:"Registering events from an event handler",id:"registering-events-from-an-event-handler",level:2},{value:"Reading entities from event handlers",id:"reading-entities-from-event-handlers",level:2}],h={toc:c};function p(e){let{components:t,...n}=e;return(0,r.kt)("wrapper",(0,a.Z)({},h,n,{components:t,mdxType:"MDXLayout"}),(0,r.kt)("h1",{id:"event-handler"},"Event handler"),(0,r.kt)("p",null,"An event handler is a class that reacts to events. They are commonly used to trigger side effects in case of a new event. For instance, if a new event is registered in the system, an event handler could send an email to the user."),(0,r.kt)("h2",{id:"creating-an-event-handler"},"Creating an event handler"),(0,r.kt)("p",null,"The Booster CLI will help you to create new event handlers. You just need to run the following command and the CLI will generate all the boilerplate for you:"),(0,r.kt)(i.Z,{mdxType:"TerminalWindow"},(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-shell"},"boost new:event-handler HandleAvailability --event StockMoved\n"))),(0,r.kt)("p",null,"This will generate a new file called ",(0,r.kt)("inlineCode",{parentName:"p"},"handle-availability.ts")," in the ",(0,r.kt)("inlineCode",{parentName:"p"},"src/event-handlers")," directory. 
You can also create the file manually, but you will need to create the class and decorate it, so we recommend using the CLI."),(0,r.kt)("h2",{id:"declaring-an-event-handler"},"Declaring an event handler"),(0,r.kt)("p",null,"In Booster, event handlers are classes decorated with the ",(0,r.kt)("inlineCode",{parentName:"p"},"@EventHandler")," decorator. The parameter of the decorator is the event that the handler will react to. The logic to be triggered after an event is registered is defined in the ",(0,r.kt)("inlineCode",{parentName:"p"},"handle")," method of the class. This ",(0,r.kt)("inlineCode",{parentName:"p"},"handle")," function will receive the event that triggered the handler."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/event-handlers/handle-availability.ts"',title:'"src/event-handlers/handle-availability.ts"'},"// highlight-next-line\n@EventHandler(StockMoved)\nexport class HandleAvailability {\n // highlight-start\n public static async handle(event: StockMoved): Promise {\n // Do something here\n }\n // highlight-end\n}\n")),(0,r.kt)("h2",{id:"creating-an-event-handler-1"},"Creating an event handler"),(0,r.kt)("p",null,"Event handlers can be easily created using the Booster CLI command ",(0,r.kt)("inlineCode",{parentName:"p"},"boost new:event-handler"),". There are two mandatory arguments: the event handler name, and the name of the event it will react to. For instance:"),(0,r.kt)(i.Z,{mdxType:"TerminalWindow"},(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript"},"boost new:event-handler HandleAvailability --event StockMoved\n"))),(0,r.kt)("p",null,"Once the creation is completed, there will be a new file in the event handlers directory ",(0,r.kt)("inlineCode",{parentName:"p"},"/src/event-handlers/handle-availability.ts"),"."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-text"},"\n\u251c\u2500\u2500 src\n\u2502 \u251c\u2500\u2500 commands\n\u2502 \u251c\u2500\u2500 common\n\u2502 \u251c\u2500\u2500 config\n\u2502 \u251c\u2500\u2500 entities\n\u2502 \u251c\u2500\u2500 events\n\u2502 \u251c\u2500\u2500 event-handlers <------ put them here\n\u2502 \u2514\u2500\u2500 read-models\n")),(0,r.kt)("h2",{id:"registering-events-from-an-event-handler"},"Registering events from an event handler"),(0,r.kt)("p",null,"Event handlers can also register new events. This is useful when you want to trigger a new event after a certain condition is met. For example, if you want to send an email to the user when a product is out of stock."),(0,r.kt)("p",null,"In order to register new events, Booster injects the ",(0,r.kt)("inlineCode",{parentName:"p"},"register")," instance in the ",(0,r.kt)("inlineCode",{parentName:"p"},"handle")," method as a second parameter. 
This ",(0,r.kt)("inlineCode",{parentName:"p"},"register")," instance has a ",(0,r.kt)("inlineCode",{parentName:"p"},"events(...)")," method that allows you to store any side effect events, you can specify as many as you need separated by commas as arguments of the function."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/event-handlers/handle-availability.ts"',title:'"src/event-handlers/handle-availability.ts"'},"@EventHandler(StockMoved)\nexport class HandleAvailability {\n public static async handle(event: StockMoved, register: Register): Promise {\n if (event.quantity < 0) {\n // highlight-next-line\n register.events([new ProductOutOfStock(event.productID)])\n }\n }\n}\n")),(0,r.kt)("h2",{id:"reading-entities-from-event-handlers"},"Reading entities from event handlers"),(0,r.kt)("p",null,"There are cases where you need to read an entity to make a decision based on its current state. Different side effects can be triggered depending on the current state of the entity. Given the previous example, if a user does not want to receive emails when a product is out of stock, we should be able check the user preferences before sending the email."),(0,r.kt)("p",null,"For that reason, Booster provides the ",(0,r.kt)("inlineCode",{parentName:"p"},"Booster.entity")," function. This function allows you to retrieve the current state of an entity. Let's say that we want to check the status of a product before we trigger its availability update. In that case we would call the ",(0,r.kt)("inlineCode",{parentName:"p"},"Booster.entity")," function, which will return information about the entity."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/event-handlers/handle-availability.ts"',title:'"src/event-handlers/handle-availability.ts"'},"@EventHandler(StockMoved)\nexport class HandleAvailability {\n public static async handle(event: StockMoved, register: Register): Promise {\n // highlight-next-line\n const product = await Booster.entity(Product, event.productID)\n if (product.stock < 0) {\n register.events([new ProductOutOfStock(event.productID)])\n }\n }\n}\n")))}p.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/55aa456f.9ce1e5fe.js b/assets/js/55aa456f.9ce1e5fe.js new file mode 100644 index 000000000..06fa079bb --- /dev/null +++ b/assets/js/55aa456f.9ce1e5fe.js @@ -0,0 +1 @@ +"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[695],{3905:(e,t,n)=>{n.d(t,{Zo:()=>c,kt:()=>u});var a=n(7294);function r(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function i(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);t&&(a=a.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,a)}return n}function l(e){for(var t=1;t=0||(r[n]=e[n]);return r}(e,t);if(Object.getOwnPropertySymbols){var i=Object.getOwnPropertySymbols(e);for(a=0;a=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(r[n]=e[n])}return r}var s=a.createContext({}),d=function(e){var t=a.useContext(s),n=t;return e&&(n="function"==typeof e?e(t):l(l({},t),e)),n},c=function(e){var t=d(e.components);return a.createElement(s.Provider,{value:t},e.children)},h={inlineCode:"code",wrapper:function(e){var t=e.children;return a.createElement(a.Fragment,{},t)}},p=a.forwardRef((function(e,t){var 
n=e.components,r=e.mdxType,i=e.originalType,s=e.parentName,c=o(e,["components","mdxType","originalType","parentName"]),p=d(n),u=r,v=p["".concat(s,".").concat(u)]||p[u]||h[u]||i;return n?a.createElement(v,l(l({ref:t},c),{},{components:n})):a.createElement(v,l({ref:t},c))}));function u(e,t){var n=arguments,r=t&&t.mdxType;if("string"==typeof e||r){var i=n.length,l=new Array(i);l[0]=p;var o={};for(var s in t)hasOwnProperty.call(t,s)&&(o[s]=t[s]);o.originalType=e,o.mdxType="string"==typeof e?e:r,l[1]=o;for(var d=2;d{n.d(t,{Z:()=>d});var a=n(7294);const r="terminalWindow_wGrl",i="terminalWindowHeader_o9Cs",l="buttons_IGLB",o="dot_fGZE",s="terminalWindowBody_tzdS";function d(e){let{children:t}=e;return a.createElement("div",{className:r},a.createElement("div",{className:i},a.createElement("div",{className:l},a.createElement("span",{className:o,style:{background:"#f25f58"}}),a.createElement("span",{className:o,style:{background:"#fbbe3c"}}),a.createElement("span",{className:o,style:{background:"#58cb42"}}))),a.createElement("div",{className:s},t))}},5268:(e,t,n)=>{n.r(t),n.d(t,{assets:()=>d,contentTitle:()=>o,default:()=>p,frontMatter:()=>l,metadata:()=>s,toc:()=>c});var a=n(7462),r=(n(7294),n(3905)),i=n(5163);const l={description:"Learn how to react to events and trigger side effects in Booster by defining event handlers."},o="Event handler",s={unversionedId:"architecture/event-handler",id:"architecture/event-handler",title:"Event handler",description:"Learn how to react to events and trigger side effects in Booster by defining event handlers.",source:"@site/docs/03_architecture/04_event-handler.mdx",sourceDirName:"03_architecture",slug:"/architecture/event-handler",permalink:"/architecture/event-handler",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/03_architecture/04_event-handler.mdx",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1697652123,formattedLastUpdatedAt:"Oct 18, 2023",sidebarPosition:4,frontMatter:{description:"Learn how to react to events and trigger side effects in Booster by defining event handlers."},sidebar:"docs",previous:{title:"Event",permalink:"/architecture/event"},next:{title:"Entity",permalink:"/architecture/entity"}},d={},c=[{value:"Creating an event handler",id:"creating-an-event-handler",level:2},{value:"Declaring an event handler",id:"declaring-an-event-handler",level:2},{value:"Creating an event handler",id:"creating-an-event-handler-1",level:2},{value:"Registering events from an event handler",id:"registering-events-from-an-event-handler",level:2},{value:"Reading entities from event handlers",id:"reading-entities-from-event-handlers",level:2}],h={toc:c};function p(e){let{components:t,...n}=e;return(0,r.kt)("wrapper",(0,a.Z)({},h,n,{components:t,mdxType:"MDXLayout"}),(0,r.kt)("h1",{id:"event-handler"},"Event handler"),(0,r.kt)("p",null,"An event handler is a class that reacts to events. They are commonly used to trigger side effects in case of a new event. For instance, if a new event is registered in the system, an event handler could send an email to the user."),(0,r.kt)("h2",{id:"creating-an-event-handler"},"Creating an event handler"),(0,r.kt)("p",null,"The Booster CLI will help you to create new event handlers. 
You just need to run the following command and the CLI will generate all the boilerplate for you:"),(0,r.kt)(i.Z,{mdxType:"TerminalWindow"},(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-shell"},"boost new:event-handler HandleAvailability --event StockMoved\n"))),(0,r.kt)("p",null,"This will generate a new file called ",(0,r.kt)("inlineCode",{parentName:"p"},"handle-availability.ts")," in the ",(0,r.kt)("inlineCode",{parentName:"p"},"src/event-handlers")," directory. You can also create the file manually, but you will need to create the class and decorate it, so we recommend using the CLI."),(0,r.kt)("h2",{id:"declaring-an-event-handler"},"Declaring an event handler"),(0,r.kt)("p",null,"In Booster, event handlers are classes decorated with the ",(0,r.kt)("inlineCode",{parentName:"p"},"@EventHandler")," decorator. The parameter of the decorator is the event that the handler will react to. The logic to be triggered after an event is registered is defined in the ",(0,r.kt)("inlineCode",{parentName:"p"},"handle")," method of the class. This ",(0,r.kt)("inlineCode",{parentName:"p"},"handle")," function will receive the event that triggered the handler."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/event-handlers/handle-availability.ts"',title:'"src/event-handlers/handle-availability.ts"'},"// highlight-next-line\n@EventHandler(StockMoved)\nexport class HandleAvailability {\n // highlight-start\n public static async handle(event: StockMoved): Promise {\n // Do something here\n }\n // highlight-end\n}\n")),(0,r.kt)("h2",{id:"creating-an-event-handler-1"},"Creating an event handler"),(0,r.kt)("p",null,"Event handlers can be easily created using the Booster CLI command ",(0,r.kt)("inlineCode",{parentName:"p"},"boost new:event-handler"),". There are two mandatory arguments: the event handler name, and the name of the event it will react to. For instance:"),(0,r.kt)(i.Z,{mdxType:"TerminalWindow"},(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript"},"boost new:event-handler HandleAvailability --event StockMoved\n"))),(0,r.kt)("p",null,"Once the creation is completed, there will be a new file in the event handlers directory ",(0,r.kt)("inlineCode",{parentName:"p"},"/src/event-handlers/handle-availability.ts"),"."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-text"},"\n\u251c\u2500\u2500 src\n\u2502 \u251c\u2500\u2500 commands\n\u2502 \u251c\u2500\u2500 common\n\u2502 \u251c\u2500\u2500 config\n\u2502 \u251c\u2500\u2500 entities\n\u2502 \u251c\u2500\u2500 events\n\u2502 \u251c\u2500\u2500 event-handlers <------ put them here\n\u2502 \u2514\u2500\u2500 read-models\n")),(0,r.kt)("h2",{id:"registering-events-from-an-event-handler"},"Registering events from an event handler"),(0,r.kt)("p",null,"Event handlers can also register new events. This is useful when you want to trigger a new event after a certain condition is met. For example, if you want to send an email to the user when a product is out of stock."),(0,r.kt)("p",null,"In order to register new events, Booster injects the ",(0,r.kt)("inlineCode",{parentName:"p"},"register")," instance in the ",(0,r.kt)("inlineCode",{parentName:"p"},"handle")," method as a second parameter. 
This ",(0,r.kt)("inlineCode",{parentName:"p"},"register")," instance has a ",(0,r.kt)("inlineCode",{parentName:"p"},"events(...)")," method that allows you to store any side effect events, you can specify as many as you need separated by commas as arguments of the function."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/event-handlers/handle-availability.ts"',title:'"src/event-handlers/handle-availability.ts"'},"@EventHandler(StockMoved)\nexport class HandleAvailability {\n public static async handle(event: StockMoved, register: Register): Promise {\n if (event.quantity < 0) {\n // highlight-next-line\n register.events([new ProductOutOfStock(event.productID)])\n }\n }\n}\n")),(0,r.kt)("h2",{id:"reading-entities-from-event-handlers"},"Reading entities from event handlers"),(0,r.kt)("p",null,"There are cases where you need to read an entity to make a decision based on its current state. Different side effects can be triggered depending on the current state of the entity. Given the previous example, if a user does not want to receive emails when a product is out of stock, we should be able check the user preferences before sending the email."),(0,r.kt)("p",null,"For that reason, Booster provides the ",(0,r.kt)("inlineCode",{parentName:"p"},"Booster.entity")," function. This function allows you to retrieve the current state of an entity. Let's say that we want to check the status of a product before we trigger its availability update. In that case we would call the ",(0,r.kt)("inlineCode",{parentName:"p"},"Booster.entity")," function, which will return information about the entity."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/event-handlers/handle-availability.ts"',title:'"src/event-handlers/handle-availability.ts"'},"@EventHandler(StockMoved)\nexport class HandleAvailability {\n public static async handle(event: StockMoved, register: Register): Promise {\n // highlight-next-line\n const product = await Booster.entity(Product, event.productID)\n if (product.stock < 0) {\n register.events([new ProductOutOfStock(event.productID)])\n }\n }\n}\n")))}p.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/5b078add.309aad70.js b/assets/js/5b078add.309aad70.js deleted file mode 100644 index 87274514c..000000000 --- a/assets/js/5b078add.309aad70.js +++ /dev/null @@ -1 +0,0 @@ -"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[1422],{3905:(e,t,n)=>{n.d(t,{Zo:()=>m,kt:()=>p});var a=n(7294);function r(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function o(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);t&&(a=a.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,a)}return n}function i(e){for(var t=1;t=0||(r[n]=e[n]);return r}(e,t);if(Object.getOwnPropertySymbols){var o=Object.getOwnPropertySymbols(e);for(a=0;a=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(r[n]=e[n])}return r}var c=a.createContext({}),l=function(e){var t=a.useContext(c),n=t;return e&&(n="function"==typeof e?e(t):i(i({},t),e)),n},m=function(e){var t=l(e.components);return a.createElement(c.Provider,{value:t},e.children)},d={inlineCode:"code",wrapper:function(e){var t=e.children;return a.createElement(a.Fragment,{},t)}},u=a.forwardRef((function(e,t){var 
n=e.components,r=e.mdxType,o=e.originalType,c=e.parentName,m=s(e,["components","mdxType","originalType","parentName"]),u=l(n),p=r,h=u["".concat(c,".").concat(p)]||u[p]||d[p]||o;return n?a.createElement(h,i(i({ref:t},m),{},{components:n})):a.createElement(h,i({ref:t},m))}));function p(e,t){var n=arguments,r=t&&t.mdxType;if("string"==typeof e||r){var o=n.length,i=new Array(o);i[0]=u;var s={};for(var c in t)hasOwnProperty.call(t,c)&&(s[c]=t[c]);s.originalType=e,s.mdxType="string"==typeof e?e:r,i[1]=s;for(var l=2;l{n.d(t,{Z:()=>l});var a=n(7294);const r="terminalWindow_wGrl",o="terminalWindowHeader_o9Cs",i="buttons_IGLB",s="dot_fGZE",c="terminalWindowBody_tzdS";function l(e){let{children:t}=e;return a.createElement("div",{className:r},a.createElement("div",{className:o},a.createElement("div",{className:i},a.createElement("span",{className:s,style:{background:"#f25f58"}}),a.createElement("span",{className:s,style:{background:"#fbbe3c"}}),a.createElement("span",{className:s,style:{background:"#58cb42"}}))),a.createElement("div",{className:c},t))}},4743:(e,t,n)=>{n.r(t),n.d(t,{assets:()=>l,contentTitle:()=>s,default:()=>u,frontMatter:()=>i,metadata:()=>c,toc:()=>m});var a=n(7462),r=(n(7294),n(3905)),o=n(5163);const i={},s="Command",c={unversionedId:"architecture/command",id:"architecture/command",title:"Command",description:"Commands are any action a user performs on your application. For example, RemoveItemFromCart, RatePhoto or AddCommentToPost. They express the intention of an user, and they are the main interaction mechanism of your application. They are a similar to the concept of a request on a REST API. Command issuers can also send data on a command as parameters.",source:"@site/docs/03_architecture/02_command.mdx",sourceDirName:"03_architecture",slug:"/architecture/command",permalink:"/architecture/command",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/03_architecture/02_command.mdx",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1696768385,formattedLastUpdatedAt:"Oct 8, 2023",sidebarPosition:2,frontMatter:{},sidebar:"docs",previous:{title:"Booster architecture",permalink:"/architecture/event-driven"},next:{title:"Event",permalink:"/architecture/event"}},l={},m=[{value:"Creating a command",id:"creating-a-command",level:2},{value:"Declaring a command",id:"declaring-a-command",level:2},{value:"The command handler function",id:"the-command-handler-function",level:2},{value:"Registering events",id:"registering-events",level:3},{value:"Returning a value",id:"returning-a-value",level:3},{value:"Validating data",id:"validating-data",level:3},{value:"Throw an error",id:"throw-an-error",level:4},{value:"Register error events",id:"register-error-events",level:4},{value:"Reading entities",id:"reading-entities",level:3},{value:"Authorizing a command",id:"authorizing-a-command",level:2},{value:"Submitting a command",id:"submitting-a-command",level:2},{value:"Commands naming convention",id:"commands-naming-convention",level:2}],d={toc:m};function u(e){let{components:t,...n}=e;return(0,r.kt)("wrapper",(0,a.Z)({},d,n,{components:t,mdxType:"MDXLayout"}),(0,r.kt)("h1",{id:"command"},"Command"),(0,r.kt)("p",null,"Commands are any action a user performs on your application. For example, ",(0,r.kt)("inlineCode",{parentName:"p"},"RemoveItemFromCart"),", ",(0,r.kt)("inlineCode",{parentName:"p"},"RatePhoto")," or ",(0,r.kt)("inlineCode",{parentName:"p"},"AddCommentToPost"),". 
They express the intention of an user, and they are the main interaction mechanism of your application. They are a similar to the concept of a ",(0,r.kt)("strong",{parentName:"p"},"request on a REST API"),". Command issuers can also send data on a command as parameters."),(0,r.kt)("h2",{id:"creating-a-command"},"Creating a command"),(0,r.kt)("p",null,"The Booster CLI will help you to create new commands. You just need to run the following command and the CLI will generate all the boilerplate for you:"),(0,r.kt)(o.Z,{mdxType:"TerminalWindow"},(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-shell"},"boost new:command CreateProduct --fields sku:SKU displayName:string description:string price:Money\n"))),(0,r.kt)("p",null,"This will generate a new file called ",(0,r.kt)("inlineCode",{parentName:"p"},"create-product")," in the ",(0,r.kt)("inlineCode",{parentName:"p"},"src/commands")," directory. You can also create the file manually, but you will need to create the class and decorate it, so we recommend using the CLI."),(0,r.kt)("h2",{id:"declaring-a-command"},"Declaring a command"),(0,r.kt)("p",null,"In Booster you define them as TypeScript classes decorated with the ",(0,r.kt)("inlineCode",{parentName:"p"},"@Command")," decorator. The ",(0,r.kt)("inlineCode",{parentName:"p"},"Command")," parameters will be declared as properties of the class."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/command-name.ts"',title:'"src/commands/command-name.ts"'},"@Command()\nexport class CommandName {\n public constructor(readonly fieldA: SomeType, readonly fieldB: SomeOtherType) {}\n}\n")),(0,r.kt)("p",null,"These commands are handled by ",(0,r.kt)("inlineCode",{parentName:"p"},"Command Handlers"),", the same way a ",(0,r.kt)("strong",{parentName:"p"},"REST Controller")," do with a request. To create a ",(0,r.kt)("inlineCode",{parentName:"p"},"Command handler")," of a specific Command, you must declare a ",(0,r.kt)("inlineCode",{parentName:"p"},"handle")," class function inside the corresponding command you want to handle. For example:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/command-name.ts"',title:'"src/commands/command-name.ts"'},"@Command()\nexport class CommandName {\n public constructor(readonly fieldA: SomeType, readonly fieldB: SomeOtherType) {}\n\n // highlight-start\n public static async handle(command: CommandName, register: Register): Promise {\n // Validate inputs\n // Run domain logic\n // register.events([event1,...])\n }\n // highlight-end\n}\n")),(0,r.kt)("p",null,"Booster will then generate the GraphQL mutation for the corresponding command, and the infrastructure to handle them. You only have to define the class and the handler function. Commands are part of the public API, so you can define authorization policies for them, you can read more about this on ",(0,r.kt)("a",{parentName:"p",href:"/security/authorization"},"the authorization section"),"."),(0,r.kt)("admonition",{type:"tip"},(0,r.kt)("p",{parentName:"admonition"},"We recommend using command handlers to validate input data before registering events into the event store because they are immutable once there.")),(0,r.kt)("h2",{id:"the-command-handler-function"},"The command handler function"),(0,r.kt)("p",null,"Each command class must have a method called ",(0,r.kt)("inlineCode",{parentName:"p"},"handle"),". 
This function is the command handler, and it will be called by the framework every time one instance of this command is submitted. Inside the handler you can run validations, return errors, query entities to make decisions, and register relevant domain events."),(0,r.kt)("h3",{id:"registering-events"},"Registering events"),(0,r.kt)("p",null,"Within the command handler execution, it is possible to register domain events. The command handler function receives the ",(0,r.kt)("inlineCode",{parentName:"p"},"register")," argument, so within the handler, it is possible to call ",(0,r.kt)("inlineCode",{parentName:"p"},"register.events(...)")," with a list of events."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/create-product.ts"',title:'"src/commands/create-product.ts"'},"@Command()\nexport class CreateProduct {\n public constructor(readonly sku: string, readonly price: number) {}\n\n public static async handle(command: CreateProduct, register: Register): Promise {\n // highlight-next-line\n register.event(new ProductCreated(/*...*/))\n }\n}\n")),(0,r.kt)("p",null,"For more details about events and the register parameter, see the ",(0,r.kt)("a",{parentName:"p",href:"/architecture/event"},(0,r.kt)("inlineCode",{parentName:"a"},"Events"))," section."),(0,r.kt)("h3",{id:"returning-a-value"},"Returning a value"),(0,r.kt)("p",null,"The command handler function can return a value. This value will be the response of the GraphQL mutation. By default, the command handler function expects you to return a ",(0,r.kt)("inlineCode",{parentName:"p"},"void")," as a return type. Since GrahpQL does not have a ",(0,r.kt)("inlineCode",{parentName:"p"},"void")," type, the command handler function returns ",(0,r.kt)("inlineCode",{parentName:"p"},"true")," when called through the GraphQL. This is because the GraphQL specification requires a response, and ",(0,r.kt)("inlineCode",{parentName:"p"},"true")," is the most appropriate value to represent a successful execution with no return value."),(0,r.kt)("p",null,"If you want to return a value, you can change the return type of the handler function. For example, if you want to return a ",(0,r.kt)("inlineCode",{parentName:"p"},"string"),":"),(0,r.kt)("p",null,"For example:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/create-product.ts"',title:'"src/commands/create-product.ts"'},"@Command()\nexport class CreateProduct {\n public constructor(readonly sku: string, readonly price: number) {}\n\n public static async handle(command: CreateProduct, register: Register): Promise {\n register.event(new ProductCreated(/*...*/))\n // highlight-next-line\n return 'Product created!'\n }\n}\n")),(0,r.kt)("h3",{id:"validating-data"},"Validating data"),(0,r.kt)("admonition",{type:"tip"},(0,r.kt)("p",{parentName:"admonition"},"Booster uses the typed nature of GraphQL to ensure that types are correct before reaching the handler, so ",(0,r.kt)("strong",{parentName:"p"},"you don't have to validate types"),".")),(0,r.kt)("h4",{id:"throw-an-error"},"Throw an error"),(0,r.kt)("p",null,"A command will fail if there is an uncaught error during its handling. When a command fails, Booster will return a detailed error response with the message of the thrown error. This is useful for debugging, but it is also a security feature. 
Booster will never return an error stack trace to the client, so you don't have to worry about exposing internal implementation details."),(0,r.kt)("p",null,"One case where you might want to throw an error is when the command is invalid because it breaks a business rule. For example, if the command contains a negative price. In that case, you can throw an error in the handler. Booster will use the error's message as the response to make it descriptive. For example, given this command:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/create-product.ts"',title:'"src/commands/create-product.ts"'},"@Command()\nexport class CreateProduct {\n public constructor(readonly sku: string, readonly price: number) {}\n\n public static async handle(command: CreateProduct, register: Register): Promise {\n const priceLimit = 10\n if (command.price >= priceLimit) {\n // highlight-next-line\n throw new Error(`price must be below ${priceLimit}, and it was ${command.price}`)\n }\n }\n}\n")),(0,r.kt)("p",null,"You'll get something like this response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "errors": [\n {\n "message": "price must be below 10, and it was 19.99",\n "path": ["CreateProduct"]\n }\n ]\n}\n')),(0,r.kt)("h4",{id:"register-error-events"},"Register error events"),(0,r.kt)("p",null,"There could be situations in which you want to register an event representing an error. For example, when moving items with insufficient stock from one location to another:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/move-stock.ts"',title:'"src/commands/move-stock.ts"'},"@Command()\nexport class MoveStock {\n public constructor(\n readonly productID: string,\n readonly origin: string,\n readonly destination: string,\n readonly quantity: number\n ) {}\n\n public static async handle(command: MoveStock, register: Register): Promise {\n if (!command.enoughStock(command.productID, command.origin, command.quantity)) {\n // highlight-next-line\n register.events(new ErrorEvent(`There is not enough stock for ${command.productID} at ${command.origin}`))\n } else {\n register.events(new StockMoved(/*...*/))\n }\n }\n\n private enoughStock(productID: string, origin: string, quantity: number): boolean {\n /* ... */\n }\n}\n")),(0,r.kt)("p",null,"In this case, the command operation can still be completed. An event handler will take care of that `ErrorEvent and proceed accordingly."),(0,r.kt)("h3",{id:"reading-entities"},"Reading entities"),(0,r.kt)("p",null,"Event handlers are a good place to make decisions and, to make better decisions, you need information. The ",(0,r.kt)("inlineCode",{parentName:"p"},"Booster.entity")," function allows you to inspect the application state. This function receives two arguments, the ",(0,r.kt)("inlineCode",{parentName:"p"},"Entity"),"'s name to fetch and the ",(0,r.kt)("inlineCode",{parentName:"p"},"entityID"),". 
Here is an example of fetching an entity called ",(0,r.kt)("inlineCode",{parentName:"p"},"Stock"),":"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/move-stock.ts"',title:'"src/commands/move-stock.ts"'},"@Command()\nexport class MoveStock {\n public constructor(\n readonly productID: string,\n readonly origin: string,\n readonly destination: string,\n readonly quantity: number\n ) {}\n\n public static async handle(command: MoveStock, register: Register): Promise {\n // highlight-next-line\n const stock = await Booster.entity(Stock, command.productID)\n if (!command.enoughStock(command.origin, command.quantity, stock)) {\n register.events(new ErrorEvent(`There is not enough stock for ${command.productID} at ${command.origin}`))\n }\n }\n\n private enoughStock(origin: string, quantity: number, stock?: Stock): boolean {\n const count = stock?.countByLocation[origin]\n return !!count && count >= quantity\n }\n}\n")),(0,r.kt)("h2",{id:"authorizing-a-command"},"Authorizing a command"),(0,r.kt)("p",null,"Commands are part of the public API of a Booster application, so you can define who is authorized to submit them. All commands are protected by default, which means that no one can submit them. In order to allow users to submit a command, you must explicitly authorize them. You can use the ",(0,r.kt)("inlineCode",{parentName:"p"},"authorize")," field of the ",(0,r.kt)("inlineCode",{parentName:"p"},"@Command")," decorator to specify the authorization rule."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/create-product.ts"',title:'"src/commands/create-product.ts"'},"@Command({\n // highlight-next-line\n authorize: 'all',\n})\nexport class CreateProduct {\n public constructor(\n readonly sku: Sku,\n readonly displayName: string,\n readonly description: string,\n readonly price: number\n ) {}\n\n public static async handle(command: CreateProduct, register: Register): Promise {\n register.events(/* YOUR EVENT HERE */)\n }\n}\n")),(0,r.kt)("p",null,"You can read more about this on the ",(0,r.kt)("a",{parentName:"p",href:"/security/authorization"},"Authorization section"),"."),(0,r.kt)("h2",{id:"submitting-a-command"},"Submitting a command"),(0,r.kt)("p",null,"Booster commands are accessible to the outside world as GraphQL mutations. GrahpQL fits very well with Booster's CQRS approach because it has two kinds of operations: Mutations and Queries. Mutations are actions that modify the server-side data, just like commands."),(0,r.kt)("p",null,"Booster automatically creates one mutation per command. The framework infers the mutation input type from the command fields. 
Given this ",(0,r.kt)("inlineCode",{parentName:"p"},"CreateProduct")," command:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript"},"@Command({\n authorize: 'all',\n})\nexport class CreateProduct {\n public constructor(\n readonly sku: Sku,\n readonly displayName: string,\n readonly description: string,\n readonly price: number\n ) {}\n\n public static async handle(command: CreateProduct, register: Register): Promise {\n register.events(/* YOUR EVENT HERE */)\n }\n}\n")),(0,r.kt)("p",null,"Booster generates the following GraphQL mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-graphql"},"mutation CreateProduct($input: CreateProductInput!): Boolean\n")),(0,r.kt)("p",null,"where the schema for ",(0,r.kt)("inlineCode",{parentName:"p"},"CreateProductInput")," is"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-text"},"{\n sku: String\n displayName: String\n description: String\n price: Float\n}\n")),(0,r.kt)("h2",{id:"commands-naming-convention"},"Commands naming convention"),(0,r.kt)("p",null,"Semantics are very important in Booster as it will play an essential role in designing a coherent system. Your application should reflect your domain concepts, and commands are not an exception. Although you can name commands in any way you want, we strongly recommend you to ",(0,r.kt)("strong",{parentName:"p"},"name them starting with verbs in imperative plus the object being affected"),". If we were designing an e-commerce application, some commands would be:"),(0,r.kt)("ul",null,(0,r.kt)("li",{parentName:"ul"},"CreateProduct"),(0,r.kt)("li",{parentName:"ul"},"DeleteProduct"),(0,r.kt)("li",{parentName:"ul"},"UpdateProduct"),(0,r.kt)("li",{parentName:"ul"},"ChangeCartItems"),(0,r.kt)("li",{parentName:"ul"},"ConfirmPayment"),(0,r.kt)("li",{parentName:"ul"},"MoveStock"),(0,r.kt)("li",{parentName:"ul"},"UpdateCartShippingAddress")),(0,r.kt)("p",null,"Despite you can place commands, and other Booster files, in any directory, we strongly recommend you to put them in ",(0,r.kt)("inlineCode",{parentName:"p"},"/src/commands"),". 
Having all the commands in one place will help you to understand your application's capabilities at a glance."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-text"},"\n\u251c\u2500\u2500 src\n\u2502\xa0\xa0 \u251c\u2500\u2500 commands <------ put them here\n\u2502\xa0\xa0 \u251c\u2500\u2500 common\n\u2502\xa0\xa0 \u251c\u2500\u2500 config\n\u2502\xa0\xa0 \u251c\u2500\u2500 entities\n\u2502\xa0\xa0 \u251c\u2500\u2500 events\n\u2502\xa0\xa0 \u251c\u2500\u2500 index.ts\n\u2502\xa0\xa0 \u2514\u2500\u2500 read-models\n")))}u.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/5b078add.59c0d427.js b/assets/js/5b078add.59c0d427.js new file mode 100644 index 000000000..8a3d73467 --- /dev/null +++ b/assets/js/5b078add.59c0d427.js @@ -0,0 +1 @@ +"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[1422],{3905:(e,t,n)=>{n.d(t,{Zo:()=>m,kt:()=>p});var a=n(7294);function r(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function o(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);t&&(a=a.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,a)}return n}function i(e){for(var t=1;t=0||(r[n]=e[n]);return r}(e,t);if(Object.getOwnPropertySymbols){var o=Object.getOwnPropertySymbols(e);for(a=0;a=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(r[n]=e[n])}return r}var c=a.createContext({}),l=function(e){var t=a.useContext(c),n=t;return e&&(n="function"==typeof e?e(t):i(i({},t),e)),n},m=function(e){var t=l(e.components);return a.createElement(c.Provider,{value:t},e.children)},d={inlineCode:"code",wrapper:function(e){var t=e.children;return a.createElement(a.Fragment,{},t)}},u=a.forwardRef((function(e,t){var n=e.components,r=e.mdxType,o=e.originalType,c=e.parentName,m=s(e,["components","mdxType","originalType","parentName"]),u=l(n),p=r,h=u["".concat(c,".").concat(p)]||u[p]||d[p]||o;return n?a.createElement(h,i(i({ref:t},m),{},{components:n})):a.createElement(h,i({ref:t},m))}));function p(e,t){var n=arguments,r=t&&t.mdxType;if("string"==typeof e||r){var o=n.length,i=new Array(o);i[0]=u;var s={};for(var c in t)hasOwnProperty.call(t,c)&&(s[c]=t[c]);s.originalType=e,s.mdxType="string"==typeof e?e:r,i[1]=s;for(var l=2;l{n.d(t,{Z:()=>l});var a=n(7294);const r="terminalWindow_wGrl",o="terminalWindowHeader_o9Cs",i="buttons_IGLB",s="dot_fGZE",c="terminalWindowBody_tzdS";function l(e){let{children:t}=e;return a.createElement("div",{className:r},a.createElement("div",{className:o},a.createElement("div",{className:i},a.createElement("span",{className:s,style:{background:"#f25f58"}}),a.createElement("span",{className:s,style:{background:"#fbbe3c"}}),a.createElement("span",{className:s,style:{background:"#58cb42"}}))),a.createElement("div",{className:c},t))}},4743:(e,t,n)=>{n.r(t),n.d(t,{assets:()=>l,contentTitle:()=>s,default:()=>u,frontMatter:()=>i,metadata:()=>c,toc:()=>m});var a=n(7462),r=(n(7294),n(3905)),o=n(5163);const i={},s="Command",c={unversionedId:"architecture/command",id:"architecture/command",title:"Command",description:"Commands are any action a user performs on your application. For example, RemoveItemFromCart, RatePhoto or AddCommentToPost. They express the intention of an user, and they are the main interaction mechanism of your application. They are a similar to the concept of a request on a REST API. 
Command issuers can also send data on a command as parameters.",source:"@site/docs/03_architecture/02_command.mdx",sourceDirName:"03_architecture",slug:"/architecture/command",permalink:"/architecture/command",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/03_architecture/02_command.mdx",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1697652123,formattedLastUpdatedAt:"Oct 18, 2023",sidebarPosition:2,frontMatter:{},sidebar:"docs",previous:{title:"Booster architecture",permalink:"/architecture/event-driven"},next:{title:"Event",permalink:"/architecture/event"}},l={},m=[{value:"Creating a command",id:"creating-a-command",level:2},{value:"Declaring a command",id:"declaring-a-command",level:2},{value:"The command handler function",id:"the-command-handler-function",level:2},{value:"Registering events",id:"registering-events",level:3},{value:"Returning a value",id:"returning-a-value",level:3},{value:"Validating data",id:"validating-data",level:3},{value:"Throw an error",id:"throw-an-error",level:4},{value:"Register error events",id:"register-error-events",level:4},{value:"Reading entities",id:"reading-entities",level:3},{value:"Authorizing a command",id:"authorizing-a-command",level:2},{value:"Submitting a command",id:"submitting-a-command",level:2},{value:"Commands naming convention",id:"commands-naming-convention",level:2}],d={toc:m};function u(e){let{components:t,...n}=e;return(0,r.kt)("wrapper",(0,a.Z)({},d,n,{components:t,mdxType:"MDXLayout"}),(0,r.kt)("h1",{id:"command"},"Command"),(0,r.kt)("p",null,"Commands are any action a user performs on your application. For example, ",(0,r.kt)("inlineCode",{parentName:"p"},"RemoveItemFromCart"),", ",(0,r.kt)("inlineCode",{parentName:"p"},"RatePhoto")," or ",(0,r.kt)("inlineCode",{parentName:"p"},"AddCommentToPost"),". They express the intention of an user, and they are the main interaction mechanism of your application. They are a similar to the concept of a ",(0,r.kt)("strong",{parentName:"p"},"request on a REST API"),". Command issuers can also send data on a command as parameters."),(0,r.kt)("h2",{id:"creating-a-command"},"Creating a command"),(0,r.kt)("p",null,"The Booster CLI will help you to create new commands. You just need to run the following command and the CLI will generate all the boilerplate for you:"),(0,r.kt)(o.Z,{mdxType:"TerminalWindow"},(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-shell"},"boost new:command CreateProduct --fields sku:SKU displayName:string description:string price:Money\n"))),(0,r.kt)("p",null,"This will generate a new file called ",(0,r.kt)("inlineCode",{parentName:"p"},"create-product")," in the ",(0,r.kt)("inlineCode",{parentName:"p"},"src/commands")," directory. You can also create the file manually, but you will need to create the class and decorate it, so we recommend using the CLI."),(0,r.kt)("h2",{id:"declaring-a-command"},"Declaring a command"),(0,r.kt)("p",null,"In Booster you define them as TypeScript classes decorated with the ",(0,r.kt)("inlineCode",{parentName:"p"},"@Command")," decorator. 
The ",(0,r.kt)("inlineCode",{parentName:"p"},"Command")," parameters will be declared as properties of the class."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/command-name.ts"',title:'"src/commands/command-name.ts"'},"@Command()\nexport class CommandName {\n public constructor(readonly fieldA: SomeType, readonly fieldB: SomeOtherType) {}\n}\n")),(0,r.kt)("p",null,"These commands are handled by ",(0,r.kt)("inlineCode",{parentName:"p"},"Command Handlers"),", the same way a ",(0,r.kt)("strong",{parentName:"p"},"REST Controller")," do with a request. To create a ",(0,r.kt)("inlineCode",{parentName:"p"},"Command handler")," of a specific Command, you must declare a ",(0,r.kt)("inlineCode",{parentName:"p"},"handle")," class function inside the corresponding command you want to handle. For example:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/command-name.ts"',title:'"src/commands/command-name.ts"'},"@Command()\nexport class CommandName {\n public constructor(readonly fieldA: SomeType, readonly fieldB: SomeOtherType) {}\n\n // highlight-start\n public static async handle(command: CommandName, register: Register): Promise {\n // Validate inputs\n // Run domain logic\n // register.events([event1,...])\n }\n // highlight-end\n}\n")),(0,r.kt)("p",null,"Booster will then generate the GraphQL mutation for the corresponding command, and the infrastructure to handle them. You only have to define the class and the handler function. Commands are part of the public API, so you can define authorization policies for them, you can read more about this on ",(0,r.kt)("a",{parentName:"p",href:"/security/authorization"},"the authorization section"),"."),(0,r.kt)("admonition",{type:"tip"},(0,r.kt)("p",{parentName:"admonition"},"We recommend using command handlers to validate input data before registering events into the event store because they are immutable once there.")),(0,r.kt)("h2",{id:"the-command-handler-function"},"The command handler function"),(0,r.kt)("p",null,"Each command class must have a method called ",(0,r.kt)("inlineCode",{parentName:"p"},"handle"),". This function is the command handler, and it will be called by the framework every time one instance of this command is submitted. Inside the handler you can run validations, return errors, query entities to make decisions, and register relevant domain events."),(0,r.kt)("h3",{id:"registering-events"},"Registering events"),(0,r.kt)("p",null,"Within the command handler execution, it is possible to register domain events. 
The command handler function receives the ",(0,r.kt)("inlineCode",{parentName:"p"},"register")," argument, so within the handler, it is possible to call ",(0,r.kt)("inlineCode",{parentName:"p"},"register.events(...)")," with a list of events."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/create-product.ts"',title:'"src/commands/create-product.ts"'},"@Command()\nexport class CreateProduct {\n public constructor(readonly sku: string, readonly price: number) {}\n\n public static async handle(command: CreateProduct, register: Register): Promise {\n // highlight-next-line\n register.event(new ProductCreated(/*...*/))\n }\n}\n")),(0,r.kt)("p",null,"For more details about events and the register parameter, see the ",(0,r.kt)("a",{parentName:"p",href:"/architecture/event"},(0,r.kt)("inlineCode",{parentName:"a"},"Events"))," section."),(0,r.kt)("h3",{id:"returning-a-value"},"Returning a value"),(0,r.kt)("p",null,"The command handler function can return a value. This value will be the response of the GraphQL mutation. By default, the command handler function expects you to return a ",(0,r.kt)("inlineCode",{parentName:"p"},"void")," as a return type. Since GrahpQL does not have a ",(0,r.kt)("inlineCode",{parentName:"p"},"void")," type, the command handler function returns ",(0,r.kt)("inlineCode",{parentName:"p"},"true")," when called through the GraphQL. This is because the GraphQL specification requires a response, and ",(0,r.kt)("inlineCode",{parentName:"p"},"true")," is the most appropriate value to represent a successful execution with no return value."),(0,r.kt)("p",null,"If you want to return a value, you can change the return type of the handler function. For example, if you want to return a ",(0,r.kt)("inlineCode",{parentName:"p"},"string"),":"),(0,r.kt)("p",null,"For example:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/create-product.ts"',title:'"src/commands/create-product.ts"'},"@Command()\nexport class CreateProduct {\n public constructor(readonly sku: string, readonly price: number) {}\n\n public static async handle(command: CreateProduct, register: Register): Promise {\n register.event(new ProductCreated(/*...*/))\n // highlight-next-line\n return 'Product created!'\n }\n}\n")),(0,r.kt)("h3",{id:"validating-data"},"Validating data"),(0,r.kt)("admonition",{type:"tip"},(0,r.kt)("p",{parentName:"admonition"},"Booster uses the typed nature of GraphQL to ensure that types are correct before reaching the handler, so ",(0,r.kt)("strong",{parentName:"p"},"you don't have to validate types"),".")),(0,r.kt)("h4",{id:"throw-an-error"},"Throw an error"),(0,r.kt)("p",null,"A command will fail if there is an uncaught error during its handling. When a command fails, Booster will return a detailed error response with the message of the thrown error. This is useful for debugging, but it is also a security feature. Booster will never return an error stack trace to the client, so you don't have to worry about exposing internal implementation details."),(0,r.kt)("p",null,"One case where you might want to throw an error is when the command is invalid because it breaks a business rule. For example, if the command contains a negative price. In that case, you can throw an error in the handler. Booster will use the error's message as the response to make it descriptive. 
For example, given this command:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/create-product.ts"',title:'"src/commands/create-product.ts"'},"@Command()\nexport class CreateProduct {\n public constructor(readonly sku: string, readonly price: number) {}\n\n public static async handle(command: CreateProduct, register: Register): Promise {\n const priceLimit = 10\n if (command.price >= priceLimit) {\n // highlight-next-line\n throw new Error(`price must be below ${priceLimit}, and it was ${command.price}`)\n }\n }\n}\n")),(0,r.kt)("p",null,"You'll get something like this response:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-json"},'{\n "errors": [\n {\n "message": "price must be below 10, and it was 19.99",\n "path": ["CreateProduct"]\n }\n ]\n}\n')),(0,r.kt)("h4",{id:"register-error-events"},"Register error events"),(0,r.kt)("p",null,"There could be situations in which you want to register an event representing an error. For example, when moving items with insufficient stock from one location to another:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/move-stock.ts"',title:'"src/commands/move-stock.ts"'},"@Command()\nexport class MoveStock {\n public constructor(\n readonly productID: string,\n readonly origin: string,\n readonly destination: string,\n readonly quantity: number\n ) {}\n\n public static async handle(command: MoveStock, register: Register): Promise {\n if (!command.enoughStock(command.productID, command.origin, command.quantity)) {\n // highlight-next-line\n register.events(new ErrorEvent(`There is not enough stock for ${command.productID} at ${command.origin}`))\n } else {\n register.events(new StockMoved(/*...*/))\n }\n }\n\n private enoughStock(productID: string, origin: string, quantity: number): boolean {\n /* ... */\n }\n}\n")),(0,r.kt)("p",null,"In this case, the command operation can still be completed. An event handler will take care of that `ErrorEvent and proceed accordingly."),(0,r.kt)("h3",{id:"reading-entities"},"Reading entities"),(0,r.kt)("p",null,"Event handlers are a good place to make decisions and, to make better decisions, you need information. The ",(0,r.kt)("inlineCode",{parentName:"p"},"Booster.entity")," function allows you to inspect the application state. This function receives two arguments, the ",(0,r.kt)("inlineCode",{parentName:"p"},"Entity"),"'s name to fetch and the ",(0,r.kt)("inlineCode",{parentName:"p"},"entityID"),". 
Here is an example of fetching an entity called ",(0,r.kt)("inlineCode",{parentName:"p"},"Stock"),":"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/move-stock.ts"',title:'"src/commands/move-stock.ts"'},"@Command()\nexport class MoveStock {\n public constructor(\n readonly productID: string,\n readonly origin: string,\n readonly destination: string,\n readonly quantity: number\n ) {}\n\n public static async handle(command: MoveStock, register: Register): Promise {\n // highlight-next-line\n const stock = await Booster.entity(Stock, command.productID)\n if (!command.enoughStock(command.origin, command.quantity, stock)) {\n register.events(new ErrorEvent(`There is not enough stock for ${command.productID} at ${command.origin}`))\n }\n }\n\n private enoughStock(origin: string, quantity: number, stock?: Stock): boolean {\n const count = stock?.countByLocation[origin]\n return !!count && count >= quantity\n }\n}\n")),(0,r.kt)("h2",{id:"authorizing-a-command"},"Authorizing a command"),(0,r.kt)("p",null,"Commands are part of the public API of a Booster application, so you can define who is authorized to submit them. All commands are protected by default, which means that no one can submit them. In order to allow users to submit a command, you must explicitly authorize them. You can use the ",(0,r.kt)("inlineCode",{parentName:"p"},"authorize")," field of the ",(0,r.kt)("inlineCode",{parentName:"p"},"@Command")," decorator to specify the authorization rule."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript",metastring:'title="src/commands/create-product.ts"',title:'"src/commands/create-product.ts"'},"@Command({\n // highlight-next-line\n authorize: 'all',\n})\nexport class CreateProduct {\n public constructor(\n readonly sku: Sku,\n readonly displayName: string,\n readonly description: string,\n readonly price: number\n ) {}\n\n public static async handle(command: CreateProduct, register: Register): Promise {\n register.events(/* YOUR EVENT HERE */)\n }\n}\n")),(0,r.kt)("p",null,"You can read more about this on the ",(0,r.kt)("a",{parentName:"p",href:"/security/authorization"},"Authorization section"),"."),(0,r.kt)("h2",{id:"submitting-a-command"},"Submitting a command"),(0,r.kt)("p",null,"Booster commands are accessible to the outside world as GraphQL mutations. GrahpQL fits very well with Booster's CQRS approach because it has two kinds of operations: Mutations and Queries. Mutations are actions that modify the server-side data, just like commands."),(0,r.kt)("p",null,"Booster automatically creates one mutation per command. The framework infers the mutation input type from the command fields. 
Given this ",(0,r.kt)("inlineCode",{parentName:"p"},"CreateProduct")," command:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-typescript"},"@Command({\n authorize: 'all',\n})\nexport class CreateProduct {\n public constructor(\n readonly sku: Sku,\n readonly displayName: string,\n readonly description: string,\n readonly price: number\n ) {}\n\n public static async handle(command: CreateProduct, register: Register): Promise {\n register.events(/* YOUR EVENT HERE */)\n }\n}\n")),(0,r.kt)("p",null,"Booster generates the following GraphQL mutation:"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-graphql"},"mutation CreateProduct($input: CreateProductInput!): Boolean\n")),(0,r.kt)("p",null,"where the schema for ",(0,r.kt)("inlineCode",{parentName:"p"},"CreateProductInput")," is"),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-text"},"{\n sku: String\n displayName: String\n description: String\n price: Float\n}\n")),(0,r.kt)("h2",{id:"commands-naming-convention"},"Commands naming convention"),(0,r.kt)("p",null,"Semantics are very important in Booster as it will play an essential role in designing a coherent system. Your application should reflect your domain concepts, and commands are not an exception. Although you can name commands in any way you want, we strongly recommend you to ",(0,r.kt)("strong",{parentName:"p"},"name them starting with verbs in imperative plus the object being affected"),". If we were designing an e-commerce application, some commands would be:"),(0,r.kt)("ul",null,(0,r.kt)("li",{parentName:"ul"},"CreateProduct"),(0,r.kt)("li",{parentName:"ul"},"DeleteProduct"),(0,r.kt)("li",{parentName:"ul"},"UpdateProduct"),(0,r.kt)("li",{parentName:"ul"},"ChangeCartItems"),(0,r.kt)("li",{parentName:"ul"},"ConfirmPayment"),(0,r.kt)("li",{parentName:"ul"},"MoveStock"),(0,r.kt)("li",{parentName:"ul"},"UpdateCartShippingAddress")),(0,r.kt)("p",null,"Despite you can place commands, and other Booster files, in any directory, we strongly recommend you to put them in ",(0,r.kt)("inlineCode",{parentName:"p"},"/src/commands"),". 
Having all the commands in one place will help you to understand your application's capabilities at a glance."),(0,r.kt)("pre",null,(0,r.kt)("code",{parentName:"pre",className:"language-text"},"\n\u251c\u2500\u2500 src\n\u2502\xa0\xa0 \u251c\u2500\u2500 commands <------ put them here\n\u2502\xa0\xa0 \u251c\u2500\u2500 common\n\u2502\xa0\xa0 \u251c\u2500\u2500 config\n\u2502\xa0\xa0 \u251c\u2500\u2500 entities\n\u2502\xa0\xa0 \u251c\u2500\u2500 events\n\u2502\xa0\xa0 \u251c\u2500\u2500 index.ts\n\u2502\xa0\xa0 \u2514\u2500\u2500 read-models\n")))}u.isMDXComponent=!0}}]); \ No newline at end of file diff --git a/assets/js/5e911e87.5eabaf17.js b/assets/js/5e911e87.5eabaf17.js deleted file mode 100644 index 93753e2a2..000000000 --- a/assets/js/5e911e87.5eabaf17.js +++ /dev/null @@ -1 +0,0 @@ -"use strict";(self.webpackChunkwebsite=self.webpackChunkwebsite||[]).push([[9089],{3905:(e,t,n)=>{n.d(t,{Zo:()=>p,kt:()=>h});var a=n(7294);function o(e,t,n){return t in e?Object.defineProperty(e,t,{value:n,enumerable:!0,configurable:!0,writable:!0}):e[t]=n,e}function i(e,t){var n=Object.keys(e);if(Object.getOwnPropertySymbols){var a=Object.getOwnPropertySymbols(e);t&&(a=a.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),n.push.apply(n,a)}return n}function r(e){for(var t=1;t=0||(o[n]=e[n]);return o}(e,t);if(Object.getOwnPropertySymbols){var i=Object.getOwnPropertySymbols(e);for(a=0;a=0||Object.prototype.propertyIsEnumerable.call(e,n)&&(o[n]=e[n])}return o}var l=a.createContext({}),u=function(e){var t=a.useContext(l),n=t;return e&&(n="function"==typeof e?e(t):r(r({},t),e)),n},p=function(e){var t=u(e.components);return a.createElement(l.Provider,{value:t},e.children)},c={inlineCode:"code",wrapper:function(e){var t=e.children;return a.createElement(a.Fragment,{},t)}},d=a.forwardRef((function(e,t){var n=e.components,o=e.mdxType,i=e.originalType,l=e.parentName,p=s(e,["components","mdxType","originalType","parentName"]),d=u(n),h=o,m=d["".concat(l,".").concat(h)]||d[h]||c[h]||i;return n?a.createElement(m,r(r({ref:t},p),{},{components:n})):a.createElement(m,r({ref:t},p))}));function h(e,t){var n=arguments,o=t&&t.mdxType;if("string"==typeof e||o){var i=n.length,r=new Array(i);r[0]=d;var s={};for(var l in t)hasOwnProperty.call(t,l)&&(s[l]=t[l]);s.originalType=e,s.mdxType="string"==typeof e?e:o,r[1]=s;for(var u=2;u{n.r(t),n.d(t,{assets:()=>l,contentTitle:()=>r,default:()=>c,frontMatter:()=>i,metadata:()=>s,toc:()=>u});var a=n(7462),o=(n(7294),n(3905));const i={},r="Contributing to Booster",s={unversionedId:"contributing",id:"contributing",title:"Contributing to Booster",description:"DISCLAIMER: The Booster docs are undergoing an overhaul. 
Most of what's written here applies, but expect some hiccups in the build process",source:"@site/docs/12_contributing.md",sourceDirName:".",slug:"/contributing",permalink:"/contributing",draft:!1,editUrl:"https://github.com/boostercloud/booster/tree/main/website/docs/12_contributing.md",tags:[],version:"current",lastUpdatedBy:"Javier Toledo",lastUpdatedAt:1696768385,formattedLastUpdatedAt:"Oct 8, 2023",sidebarPosition:12,frontMatter:{},sidebar:"docs",previous:{title:"Frequently Asked Questions",permalink:"/frequently-asked-questions"}},l={},u=[{value:"Code of Conduct",id:"code-of-conduct",level:2},{value:"I don't want to read this whole thing, I just have a question",id:"i-dont-want-to-read-this-whole-thing-i-just-have-a-question",level:2},{value:"What should I know before I get started?",id:"what-should-i-know-before-i-get-started",level:2},{value:"Packages",id:"packages",level:3},{value:"How Can I Contribute?",id:"how-can-i-contribute",level:2},{value:"Reporting Bugs",id:"reporting-bugs",level:3},{value:"Suggesting Enhancements",id:"suggesting-enhancements",level:3},{value:"Improving documentation",id:"improving-documentation",level:3},{value:"Documentation principles and practices",id:"documentation-principles-and-practices",level:4},{value:"Principles",id:"principles",level:5},{value:"Practices",id:"practices",level:5},{value:"Create your very first GitHub issue",id:"create-your-very-first-github-issue",level:3},{value:"Your First Code Contribution",id:"your-first-code-contribution",level:2},{value:"Getting the code",id:"getting-the-code",level:3},{value:"Understanding the "rush monorepo" approach and how dependencies are structured in the project",id:"understanding-the-rush-monorepo-approach-and-how-dependencies-are-structured-in-the-project",level:3},{value:"Running unit tests",id:"running-unit-tests",level:3},{value:"Running integration tests",id:"running-integration-tests",level:3},{value:"Github flow",id:"github-flow",level:3},{value:"Publishing your Pull Request",id:"publishing-your-pull-request",level:3},{value:"Branch naming conventions",id:"branch-naming-conventions",level:3},{value:"Commit message guidelines",id:"commit-message-guidelines",level:3},{value:"Code Style Guidelines",id:"code-style-guidelines",level:2},{value:"Importing other files and libraries",id:"importing-other-files-and-libraries",level:3},{value:"Functional style",id:"functional-style",level:3},{value:"Use const and let",id:"use-const-and-let",level:3}],p={toc:u};function c(e){let{components:t,...n}=e;return(0,o.kt)("wrapper",(0,a.Z)({},p,n,{components:t,mdxType:"MDXLayout"}),(0,o.kt)("h1",{id:"contributing-to-booster"},"Contributing to Booster"),(0,o.kt)("blockquote",null,(0,o.kt)("p",{parentName:"blockquote"},(0,o.kt)("strong",{parentName:"p"},"DISCLAIMER:")," The Booster docs are undergoing an overhaul. Most of what's written here applies, but expect some hiccups in the build process\nthat is described here, as it changed in the last version. New documentation will have this documented properly.")),(0,o.kt)("p",null,"Thanks for taking the time to contribute to Booster. It is an open-source project and it wouldn't be possible without people like you \ud83d\ude4f\ud83c\udf89"),(0,o.kt)("p",null,"This document is a set of guidelines to help you contribute to Booster, which is hosted on the ",(0,o.kt)("a",{parentName:"p",href:"https://github.com/boostercloud"},(0,o.kt)("inlineCode",{parentName:"a"},"boostercloud"))," GitHub\norganization. 
These aren\u2019t absolute laws, use your judgment and common sense \ud83d\ude00.\nRemember that if something here doesn't make sense, you can also propose a change to this document."),(0,o.kt)("h2",{id:"code-of-conduct"},"Code of Conduct"),(0,o.kt)("p",null,"This project and everyone participating in it are expected to uphold the ",(0,o.kt)("a",{parentName:"p",href:"https://github.com/boostercloud/booster/blob/main/CODE_OF_CONDUCT.md"},"Booster's Code of Conduct"),", based on the Covenant Code of Conduct.\nIf you see unacceptable behavior, please communicate so to ",(0,o.kt)("inlineCode",{parentName:"p"},"hello@booster.cloud"),"."),(0,o.kt)("h2",{id:"i-dont-want-to-read-this-whole-thing-i-just-have-a-question"},"I don't want to read this whole thing, I just have a question"),(0,o.kt)("p",null,"Go ahead and ask the community in ",(0,o.kt)("a",{parentName:"p",href:"https://discord.com/invite/bDY8MKx"},"Discord")," or ",(0,o.kt)("a",{parentName:"p",href:"https://github.com/boostercloud/booster/issues"},"create a new issue"),"."),(0,o.kt)("h2",{id:"what-should-i-know-before-i-get-started"},"What should I know before I get started?"),(0,o.kt)("h3",{id:"packages"},"Packages"),(0,o.kt)("p",null,"Booster is divided in many different packages. The criteria to split the code in packages is that each package meets at least one of the following conditions:"),(0,o.kt)("ul",null,(0,o.kt)("li",{parentName:"ul"},"They must be run separately, for instance, the CLI is run locally, while the support code for the project is run on the cloud."),(0,o.kt)("li",{parentName:"ul"},"They contain code that is used by at least two of the other packages."),(0,o.kt)("li",{parentName:"ul"},"They're a vendor-specific specialization of some abstract part of the framework (for instance, all the code that is required by AWS is in separate packages).")),(0,o.kt)("p",null,"The packages are managed using ",(0,o.kt)("a",{parentName:"p",href:"https://rushjs.io/"},"rush")," and ",(0,o.kt)("a",{parentName:"p",href:"https://npmjs.com"},"npm"),", if you run ",(0,o.kt)("inlineCode",{parentName:"p"},"rush build"),", it will build all the packages."),(0,o.kt)("p",null,"The packages are published to ",(0,o.kt)("inlineCode",{parentName:"p"},"npmjs")," under the prefix ",(0,o.kt)("inlineCode",{parentName:"p"},"@boostercloud/"),", their purpose is as follows:"),(0,o.kt)("ul",null,(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"cli")," - You guessed it! This package is the ",(0,o.kt)("inlineCode",{parentName:"li"},"boost")," command-line tool, it interacts only with the core package in order to load the project configuration. The specific provider packages to interact with the cloud providers are loaded dynamically from the project config."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"framework-core")," - This one contains all the framework runtime vendor-independent logic. Stuff like the generation of the config or the commands and events handling happens here. The specific provider packages to interact with the cloud providers are loaded dynamically from the project config."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"framework-integration-tests")," - Implements integration tests for all supported vendors. Tests are run on real infrastructure using the same mechanisms than a production application. 
This package ",(0,o.kt)("inlineCode",{parentName:"li"},"src")," folder includes a synthetic Booster application that can be deployed to a real provider for testing purposes."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"framework-provider-aws")," - Implements all the required adapters to make the booster core run on top of AWS technologies like Lambda and DynamoDB using the AWS SDK under the hoods."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"framework-provider-aws-infrastructure")," - Implements all the required adapters to allow Booster applications to be deployed to AWS using the AWS CDK under the hoods."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"framework-provider-local")," - Implements all the required adapters to run the Booster application on a local express server to be able to debug your code before deploying it to a real cloud provider."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"framework-provider-local-infrastructure")," - Implements all the required code to run the local development server."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"framework-types")," - This package defines types that the rest of the project will use. This is useful for avoiding cyclic dependencies. Note that this package should not contain stuff that are not types, or very simple methods related directly to them, i.e. a getter or setter. This package defines the main booster concepts like:",(0,o.kt)("ul",{parentName:"li"},(0,o.kt)("li",{parentName:"ul"},"Entity"),(0,o.kt)("li",{parentName:"ul"},"Command"),(0,o.kt)("li",{parentName:"ul"},"etc\u2026")))),(0,o.kt)("p",null,"This is a dependency graph that shows the dependencies among all packages, including the application using Booster:\n",(0,o.kt)("img",{parentName:"p",src:"https://raw.githubusercontent.com/boostercloud/booster/main/docs/img/packages-dependencies.png",alt:"Booster packages dependencies"})),(0,o.kt)("h2",{id:"how-can-i-contribute"},"How Can I Contribute?"),(0,o.kt)("p",null,"Contributing to an open source project is never just a matter of code, you can help us significantly by just using Booster and interacting with our community. Here you'll find some tips on how to do it effectively."),(0,o.kt)("h3",{id:"reporting-bugs"},"Reporting Bugs"),(0,o.kt)("p",null,"Before creating a bug report, please search for similar issues to make sure that they're not already reported. If you don't find any, go ahead and create an issue including as many details as possible. Fill out the required template, the information requested helps us to resolve issues faster."),(0,o.kt)("blockquote",null,(0,o.kt)("p",{parentName:"blockquote"},(0,o.kt)("strong",{parentName:"p"},"Note"),": If you find a Closed issue that seems related to the issues that you're experiencing, make sure to reference it in the body of your new one by writing its number like this => #42 (Github will autolink it for you).")),(0,o.kt)("p",null,"Bugs are tracked as GitHub issues. Explain the problem and include additional details to help maintainers reproduce the problem:"),(0,o.kt)("ul",null,(0,o.kt)("li",{parentName:"ul"},"Use a clear and descriptive title for the issue to identify the problem."),(0,o.kt)("li",{parentName:"ul"},"Describe the exact steps which reproduce the problem in as many details as possible."),(0,o.kt)("li",{parentName:"ul"},"Provide specific examples to demonstrate the steps. 
Include links to files or GitHub projects, or copy/pasteable snippets, which you use in those examples. If you're providing snippets in the issue, use Markdown code blocks."),(0,o.kt)("li",{parentName:"ul"},"Describe the behavior you observed after following the steps and point out what exactly is the problem with that behavior."),(0,o.kt)("li",{parentName:"ul"},"Explain which behavior you expected to see instead and why."),(0,o.kt)("li",{parentName:"ul"},"If the problem is related to performance or memory, include a CPU profile capture with your report.")),(0,o.kt)("h3",{id:"suggesting-enhancements"},"Suggesting Enhancements"),(0,o.kt)("p",null,"Enhancement suggestions are tracked as GitHub issues. Make sure you provide the following information:"),(0,o.kt)("ul",null,(0,o.kt)("li",{parentName:"ul"},"Use a clear and descriptive title for the issue to identify the suggestion."),(0,o.kt)("li",{parentName:"ul"},"Provide a step-by-step description of the suggested enhancement in as many details as possible."),(0,o.kt)("li",{parentName:"ul"},"Provide specific examples to demonstrate the steps. Include copy/pasteable snippets which you use in those examples, as Markdown code blocks."),(0,o.kt)("li",{parentName:"ul"},"Describe the current behavior and explain which behavior you expected to see instead and why."),(0,o.kt)("li",{parentName:"ul"},"Explain why this enhancement would be useful to most Booster users and isn't something that can or should be implemented as a community package."),(0,o.kt)("li",{parentName:"ul"},"List some other libraries or frameworks where this enhancement exists.")),(0,o.kt)("h3",{id:"improving-documentation"},"Improving documentation"),(0,o.kt)("p",null,(0,o.kt)("a",{parentName:"p",href:"https://docs.boosterframework.com"},"Booster documentation"),' is treated as a live document that continues improving on a daily basis. If you find something that is missing or can be improved, please contribute, it will be of great help for other developers.\nTo contribute you can use the button "Edit on github" at the top of each chapter.'),(0,o.kt)("h4",{id:"documentation-principles-and-practices"},"Documentation principles and practices"),(0,o.kt)("p",null,"The ultimate goal of a technical document is to translate the knowledge from the technology creators into the reader's mind so that they learn. The challenging\npart here is the one in which they learn. It is challenging because, under the same amount of information, a person can suffer an information overload because\nwe (humans) don't have the same information-processing capacity. That idea is going to work as our compass, it should drive our efforts so people with less\ncapacity is still able to follow and understand our documentation."),(0,o.kt)("p",null,"To achieve our goal we propose writing documentation following these principles:"),(0,o.kt)("ol",null,(0,o.kt)("li",{parentName:"ol"},"Clean and Clear"),(0,o.kt)("li",{parentName:"ol"},"Simple"),(0,o.kt)("li",{parentName:"ol"},"Coherent"),(0,o.kt)("li",{parentName:"ol"},"Explicit"),(0,o.kt)("li",{parentName:"ol"},"Attractive"),(0,o.kt)("li",{parentName:"ol"},"Inclusive"),(0,o.kt)("li",{parentName:"ol"},"Cohesive")),(0,o.kt)("h5",{id:"principles"},"Principles"),(0,o.kt)("p",null,(0,o.kt)("strong",{parentName:"p"},"1. Clean and Clear")),(0,o.kt)("p",null,"Less is more. Apple is, among many others, a good example of creating clean and clear content, where visual elements are carefully chosen to look beautiful\n(e.g. 
",(0,o.kt)("a",{parentName:"p",href:"https://developer.apple.com/tutorials/swiftui"},"Apple's swift UI"),") and making the reader getting the point as soon as possible."),(0,o.kt)("p",null,"The intention of every section, paragraph, and sentence must be clear, we should avoid writing details of two different things even when they are related.\nIt is better to link pages and keep the focus and the intention clear, Wikipedia is the best example on this."),(0,o.kt)("p",null,(0,o.kt)("strong",{parentName:"p"},"2. Simple")),(0,o.kt)("p",null,"Technical writings deal with different backgrounds and expertise from the readers. We should not assume the reader knows everything we are talking about\nbut we should not explain everything in the same paragraph or section. Every section has a goal to stick to the goal and link to internal or external resources\nto go deeper."),(0,o.kt)("p",null,"Diagrams are great tools, you know a picture is worth more than a thousand words unless that picture contains too much information.\nKeep it simple intentionally omitting details."),(0,o.kt)("p",null,(0,o.kt)("strong",{parentName:"p"},"3. Coherent")),(0,o.kt)("p",null,"The documentation tells a story. Every section should integrate naturally without making the reader switch between different contexts. Text, diagrams,\nand code examples should support each other without introducing abrupt changes breaking the reader\u2019s flow. Also, the font, colors, diagrams, code samples,\nanimations, and all the visual elements we include, should support the story we are telling."),(0,o.kt)("p",null,(0,o.kt)("strong",{parentName:"p"},"4. Explicit")),(0,o.kt)("p",null,"Go straight to the point without assuming the readers should know about something. Again, link internal or external resources to clarify."),(0,o.kt)("p",null,"The index of the whole content must be visible all the time so the reader knows exactly where they are and what is left."),(0,o.kt)("p",null,(0,o.kt)("strong",{parentName:"p"},"5. Attractive")),(0,o.kt)("p",null,"Our text must be nice to read, our diagrams delectable to see, and our site\u2026 a feast for the eyes!!"),(0,o.kt)("p",null,(0,o.kt)("strong",{parentName:"p"},"6. Inclusive")),(0,o.kt)("p",null,"Everybody should understand our writings, especially the topics at the top. We have arranged the documentation structure in a way that anybody can dig\ndeeper by just going down so, sections 1 to 4 must be suitable for all ages."),(0,o.kt)("p",null,"Use gender-neutral language to avoid the use of he, him, his to refer to undetermined gender. It is better to use their or they as a gender-neutral\napproach than s/he or similars."),(0,o.kt)("p",null,(0,o.kt)("strong",{parentName:"p"},"7. Cohesive")),(0,o.kt)("p",null,"Writing short and concise sentences is good, but remember to use proper connectors (\u201cTherefore\u201d, \u201cBesides\u201d, \u201cHowever\u201d, \u201cthus\u201d, etc) that provide a\nsense of continuation to the whole paragraph. If not, when people read the paragraphs, their internal voice sounds like a robot with unnatural stops."),(0,o.kt)("p",null,"For example, read this paragraph and try to hear your internal voice:"),(0,o.kt)("blockquote",null,(0,o.kt)("p",{parentName:"blockquote"},"Entities are created on the fly, by reducing the whole event stream. You shouldn't assume that they are stored anywhere. Booster does create\nautomatic snapshots to make the reduction process efficient. 
You are the one in charge of writing the reducer function.")),(0,o.kt)("p",null,"And now read this one:"),(0,o.kt)("blockquote",null,(0,o.kt)("p",{parentName:"blockquote"},"Entities are created on the fly by reducing the whole event stream. While you shouldn't assume that they are stored anywhere, Booster does create automatic\nsnapshots to make the reduction process efficient. In any case, this is opaque to you and the only thing you should care is to provide the reducer function.")),(0,o.kt)("p",null,"Did you feel the difference? The latter makes you feel that everything is connected, it is more cohesive."),(0,o.kt)("h5",{id:"practices"},"Practices"),(0,o.kt)("p",null,"There are many writing styles depending on the type of document. It is common within technical and scientific writing to use Inductive and/or Deductive styles\nfor paragraphs. They have different outcomes and one style may suit better in one case or another, that is why it is important to know them, and decide which\none to use in every moment. Let\u2019s see the difference with 2 recursive examples."),(0,o.kt)("p",null,(0,o.kt)("strong",{parentName:"p"},"Deductive paragraphs ease the reading for advanced users but still allows you to elaborate on ideas and concepts for newcomers"),". In deductive paragraphs,\nthe conclusions or definitions appear at the beginning, and then, details, facts, or supporting phrases complete the paragraph\u2019s idea. By placing the\nconclusion in the first sentence, the reader immediately identifies the main point so they can decide to skip the whole paragraph or keep reading.\nIf you take a look at the structure of this paragraph, it is deductive."),(0,o.kt)("p",null,"On the other hand, if you want to drive the readers' attention and play with it as if they were in a roller coaster, you can do so by using a different approach.\nIn that approach, you first introduce the facts and ideas and then you wrap them with a conclusion. This style is more narrative and forces the reader to\ncontinue because the main idea is diluted in the whole paragraph. Once all the ideas are placed together, you can finally conclude the paragraph. ",(0,o.kt)("strong",{parentName:"p"},"This style is\ncalled Inductive.")),(0,o.kt)("p",null,"The first paragraph is deductive and the last one is inductive. In general, it is better to use the deductive style, but if we stick to one, our writing will start looking weird and maybe boring.\nSo decide one or another being conscious about your intention."),(0,o.kt)("h3",{id:"create-your-very-first-github-issue"},"Create your very first GitHub issue"),(0,o.kt)("p",null,(0,o.kt)("a",{parentName:"p",href:"https://github.com/boostercloud/booster/issues/new"},"Click here")," to start making contributions to Booster."),(0,o.kt)("h2",{id:"your-first-code-contribution"},"Your First Code Contribution"),(0,o.kt)("p",null,"Unsure where to begin contributing to Booster? You can start by looking through issued tagged as ",(0,o.kt)("inlineCode",{parentName:"p"},"good-first-issue")," and ",(0,o.kt)("inlineCode",{parentName:"p"},"help-wanted"),":"),(0,o.kt)("ul",null,(0,o.kt)("li",{parentName:"ul"},"Beginner issues - issues which should only require a few lines of code, and a test or two."),(0,o.kt)("li",{parentName:"ul"},"Help wanted issues - issues which should be a bit more involved than beginner issues.")),(0,o.kt)("p",null,"Both issue lists are sorted by the total number of comments. 
While not perfect, number of comments is a reasonable proxy for impact a given change will have."),(0,o.kt)("p",null,"Make sure that you assign the chosen issue to yourself to communicate your intention to work on it and reduce the possibilities of other people taking the same assignment."),(0,o.kt)("h3",{id:"getting-the-code"},"Getting the code"),(0,o.kt)("p",null,"To start contributing to the project you would need to set up the project in your system, to do so, you must first follow these steps in your terminal."),(0,o.kt)("ul",null,(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("p",{parentName:"li"},"Install Rush: ",(0,o.kt)("inlineCode",{parentName:"p"},"npm install -g @microsoft/rush"))),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("p",{parentName:"li"},"Clone the repo and get into the directory of the project: ",(0,o.kt)("inlineCode",{parentName:"p"},"git clone && cd booster"))),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("p",{parentName:"li"},"Install project dependencies: ",(0,o.kt)("inlineCode",{parentName:"p"},"rush update"))),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("p",{parentName:"li"},"Compile the project ",(0,o.kt)("inlineCode",{parentName:"p"},"rush build"))),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("p",{parentName:"li"},"Add your contribution")),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("p",{parentName:"li"},"Make sure everything works by ",(0,o.kt)("a",{parentName:"p",href:"#running-unit-tests"},"executing the unit tests"),": ",(0,o.kt)("inlineCode",{parentName:"p"},"rush rest")))),(0,o.kt)("blockquote",null,(0,o.kt)("p",{parentName:"blockquote"},(0,o.kt)("strong",{parentName:"p"},"DISCLAIMER"),": The integration test process changed, feel free to chime in into our Discord for more info")),(0,o.kt)("ul",null,(0,o.kt)("li",{parentName:"ul"},"Make sure everything works by ",(0,o.kt)("a",{parentName:"li",href:"#running-integration-tests"},"running the integration tests"),":")),(0,o.kt)("pre",null,(0,o.kt)("code",{parentName:"pre",className:"language-bash"},"rush pack-integration-deps\ncd packages/framework-integration-tests\nrushx integration -v\n")),(0,o.kt)("h3",{id:"understanding-the-rush-monorepo-approach-and-how-dependencies-are-structured-in-the-project"},'Understanding the "rush monorepo" approach and how dependencies are structured in the project'),(0,o.kt)("p",null,"The Booster Framework project is organized following the ",(0,o.kt)("a",{parentName:"p",href:"https://rushjs.io/"},'"rush monorepo"'),' structure. There are several "package.json" files and each one has its purpose with regard to the dependencies you include on them:'),(0,o.kt)("ul",null,(0,o.kt)("li",{parentName:"ul"},'The "package.json" files that are on each package root should contain the dependencies used by that specific package. Be sure to correctly differentiate which dependency is only for development and which one is for production.')),(0,o.kt)("p",null,"Finally, ",(0,o.kt)("strong",{parentName:"p"},"always use exact numbers for dependency versions"),'. This means that if you want to add the dependency "aws-sdk" in version 1.2.3, you should add ',(0,o.kt)("inlineCode",{parentName:"p"},'"aws-sdk": "1.2.3"'),' to the corresponding "package.json" file, and never ',(0,o.kt)("inlineCode",{parentName:"p"},'"aws-sdk": "^1.2.3"')," or ",(0,o.kt)("inlineCode",{parentName:"p"},'"aws-sdk": "~1.2.3"'),". 
This restriction comes from hard problems we've had in the past."),(0,o.kt)("h3",{id:"running-unit-tests"},"Running unit tests"),(0,o.kt)("p",null,"Unit tests are executed when you type ",(0,o.kt)("inlineCode",{parentName:"p"},"rush test"),". If you want to run the unit tests for an especific package, you should move to the corresponding folder and run one of the following commands:"),(0,o.kt)("ul",null,(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"rushx test:cli -v"),": Run unit tests for the ",(0,o.kt)("inlineCode",{parentName:"li"},"cli")," package."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"rushx test:core -v"),": Run unit tests for the ",(0,o.kt)("inlineCode",{parentName:"li"},"framework-core")," package."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"rushx test:provider-aws -v"),": Run unit tests for the ",(0,o.kt)("inlineCode",{parentName:"li"},"framework-provider-aws")," package."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"rushx test:provider-aws-infrastructure -v"),": Run unit tests for the ",(0,o.kt)("inlineCode",{parentName:"li"},"framework-provider-aws-infrastructure")," package."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"rushx test:provider-azure -v"),": Run unit tests for the ",(0,o.kt)("inlineCode",{parentName:"li"},"framework-provider-azure")," package."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"rushx test:provider-azure-infrastructure -v"),": Run unit tests for the ",(0,o.kt)("inlineCode",{parentName:"li"},"framework-provider-azure-infrastructure")," package."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"rushx test:provider-local -v"),": Run unit tests for the ",(0,o.kt)("inlineCode",{parentName:"li"},"framework-provider-local")," package."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"rushx test:provider-local-infrastructure -v"),": Run unit tests for the ",(0,o.kt)("inlineCode",{parentName:"li"},"framework-provider-local-infrastructure")," package."),(0,o.kt)("li",{parentName:"ul"},(0,o.kt)("inlineCode",{parentName:"li"},"rushx test:types -v"),": Run unit tests for the ",(0,o.kt)("inlineCode",{parentName:"li"},"framework-types")," package.")),(0,o.kt)("h3",{id:"running-integration-tests"},"Running integration tests"),(0,o.kt)("p",null,"Integration tests are run automatically in Github Actions when a PR is locked, but it would be recommendable to run them locally before submitting a PR for review. You can find several scripts in ",(0,o.kt)("inlineCode",{parentName:"p"},"packages/framework-integration-tests/package.json")," to run different test suites. You can run them using rush tool:"),(0,o.kt)("p",null,(0,o.kt)("inlineCode",{parentName:"p"},"rushx - +

Booster CLI

Booster CLI is a command-line interface that helps you create, develop, and deploy your Booster applications. It is built with Node.js and published to NPM as the @boostercloud/cli package. You can install it using any compatible package manager. If you want to contribute to the project, you will also need to clone the GitHub repository and compile the source code.

Installation

The preferred way to install the Booster CLI is through NPM. If you don't have NPM on your machine yet, you can install it by following the instructions on the Node.js website.

Once you have NPM installed, you can install the Booster CLI by running this command:

npm install -g @boostercloud/cli

Usage

Once the installation is finished, you will have the boost command available in your terminal. You can run it to see the help message.

tip

You can also run boost --help to get the same output.

Command Overview

Command                     Description
new:project                 Creates a new Booster project in a new directory
new:command                 Creates a new command in the project
new:entity                  Creates a new entity in the project
new:event                   Creates a new event in the project
new:event-handler           Creates a new event handler in the project
new:read-model              Creates a new read model in the project
new:scheduled-command       Creates a new scheduled command in the project
start -e <environment>      Starts the project in development mode
build                       Builds the project
deploy -e <environment>     Deploys the project to the cloud
nuke                        Deletes all the resources created by the deploy command
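
For instance, a minimal session with the CLI could look like the sketch below. The project name my-store, the command fields, and the environment names local and production are only illustrative; use whatever names and environments your own project defines.

boost new:project my-store
cd my-store
boost new:command CreateProduct --fields sku:string price:number
boost build
boost start -e local
boost deploy -e production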
+ \ No newline at end of file diff --git a/category/features/index.html b/category/features/index.html index 470192515..f87f64f56 100644 --- a/category/features/index.html +++ b/category/features/index.html @@ -6,13 +6,13 @@ Features | Booster Framework - + - + \ No newline at end of file diff --git a/category/getting-started/index.html b/category/getting-started/index.html index 1590499be..a5010144f 100644 --- a/category/getting-started/index.html +++ b/category/getting-started/index.html @@ -6,13 +6,13 @@ Getting Started | Booster Framework - + - + \ No newline at end of file diff --git a/category/going-deeper-with-booster/index.html b/category/going-deeper-with-booster/index.html index 95c4b9fc8..c2a5b21ba 100644 --- a/category/going-deeper-with-booster/index.html +++ b/category/going-deeper-with-booster/index.html @@ -6,13 +6,13 @@ Going deeper with Booster | Booster Framework - +

Going deeper with Booster

- + \ No newline at end of file diff --git a/contributing/index.html b/contributing/index.html index a0d4e3387..cdc3f8771 100644 --- a/contributing/index.html +++ b/contributing/index.html @@ -6,7 +6,7 @@ Contributing to Booster | Booster Framework - + @@ -41,8 +41,8 @@ In that approach, you first introduce the facts and ideas and then you wrap them with a conclusion. This style is more narrative and forces the reader to continue because the main idea is diluted in the whole paragraph. Once all the ideas are placed together, you can finally conclude the paragraph. This style is called Inductive.

The first paragraph is deductive and the last one is inductive. In general, it is better to use the deductive style, but if we stick to one, our writing will start looking weird and maybe boring. -So decide one or another being conscious about your intention.

Create your very first GitHub issue

Click here to start making contributions to Booster.

Your First Code Contribution

Unsure where to begin contributing to Booster? You can start by looking through issued tagged as good-first-issue and help-wanted:

  • Beginner issues - issues which should only require a few lines of code, and a test or two.
  • Help wanted issues - issues which should be a bit more involved than beginner issues.

Both issue lists are sorted by the total number of comments. While not perfect, the number of comments is a reasonable proxy for the impact a given change will have.

Make sure that you assign the chosen issue to yourself to communicate your intention to work on it and reduce the chances of someone else picking up the same issue.

Getting the code

To start contributing to the project, you need to set it up on your system. To do so, follow these steps in your terminal:

  • Install Rush: npm install -g @microsoft/rush

  • Clone the repo and get into the directory of the project: git clone <WRITE REPO URL HERE> && cd booster

  • Install project dependencies: rush update

  • Compile the project: rush build

  • Add your contribution

  • Make sure everything works by executing the unit tests: rush test

DISCLAIMER: The integration test process changed; feel free to drop by our Discord for more info.

rush pack-integration-deps
cd packages/framework-integration-tests
rushx integration -v

Understanding the "rush monorepo" approach and how dependencies are structured in the project

The Booster Framework project is organized following the "rush monorepo" structure. There are several "package.json" files, and each one has its own purpose with regard to the dependencies you include in it:

  • The "package.json" files that are on each package root should contain the dependencies used by that specific package. Be sure to correctly differentiate which dependency is only for development and which one is for production.

Finally, always use exact numbers for dependency versions. This means that if you want to add the dependency "aws-sdk" in version 1.2.3, you should add "aws-sdk": "1.2.3" to the corresponding "package.json" file, and never "aws-sdk": "^1.2.3" or "aws-sdk": "~1.2.3". This restriction comes from hard problems we've had in the past.
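
As a small illustration (the version numbers and the second package name are made up), a dependencies section following this rule would look like this, with every version pinned exactly and no ^ or ~ ranges:

"dependencies": {
  "aws-sdk": "1.2.3",
  "some-other-lib": "4.5.6"
}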

Running unit tests

Unit tests are executed when you type rush test. If you want to run the unit tests for a specific package, you should move to the corresponding folder and run one of the following commands:

  • rushx test:cli -v: Run unit tests for the cli package.
  • rushx test:core -v: Run unit tests for the framework-core package.
  • rushx test:provider-aws -v: Run unit tests for the framework-provider-aws package.
  • rushx test:provider-aws-infrastructure -v: Run unit tests for the framework-provider-aws-infrastructure package.
  • rushx test:provider-azure -v: Run unit tests for the framework-provider-azure package.
  • rushx test:provider-azure-infrastructure -v: Run unit tests for the framework-provider-azure-infrastructure package.
  • rushx test:provider-local -v: Run unit tests for the framework-provider-local package.
  • rushx test:provider-local-infrastructure -v: Run unit tests for the framework-provider-local-infrastructure package.
  • rushx test:types -v: Run unit tests for the framework-types package.

Running integration tests

Integration tests are run automatically in GitHub Actions when a PR is locked, but it is recommended to run them locally before submitting a PR for review. You can find several scripts in packages/framework-integration-tests/package.json to run different test suites. You can run them using the rushx tool:

rushx <script name> -v

These are the available scripts to run integration tests:

  • rushx integration -v: Run all the integration test suites in the right order.
  • rushx integration/aws-deploy -v: This test just checks that the sample project in packages/framework-integration-tests/src can be successfully deployed to AWS. The deployment process takes several minutes and this project is used by all the other AWS integration tests, so this test must be run before them.
  • rushx integration/aws-func -v: AWS functional integration tests. They stress the deployed app's write API and check that the results are the expected ones, both in the databases and in the read APIs.
  • rushx integration/end-to-end -v: Runs complete and realistic use cases on several cloud providers. These tests are intended to verify that a single project can be deployed to different cloud providers. Currently, only AWS is implemented though.
  • rushx integration/aws-nuke -v: This test checks that the application deployed to AWS can be properly nuked. This test should be the last one, after the other AWS-related test suites have finished.
  • rushx integration/local -v: Checks that the test application can be launched locally and that the APIs and the databases behave as expected.
  • rushx integration/cli -v: Checks the CLI commands and verifies that they produce the expected results.

AWS integration tests run against real AWS resources, so you'll need to have your AWS credentials properly set up on your development machine. By default, the sample project will be deployed to your default account. Basically, if you can deploy a Booster project to AWS, you should be good to go (see more details about setting up an AWS account in the docs). Notice that while all resources used by Booster are included in the AWS free tier, running these tests in your own AWS account could incur some expenses.

Github flow

The preferred way of accepting contributions is following the GitHub flow: you fork the project, work in your own branch until you're happy with the result, and then submit a PR on GitHub.

Publishing your Pull Request

Make sure that you describe your change thoroughly in the PR body, adding references for any related issues and links to any resource that helps clarify the intent and goals of the change.

When you submit a PR to the Booster repository:

  • Unit tests will be automatically run. PRs with non-passing tests can't be merged.
  • If tests pass, your code will be reviewed by at least two core team members. Clarifications or improvements might be requested. They reserve the right to close any PR that does not meet the project quality standards, goals, or philosophy, so discussing your plans in an issue or the Spectrum channel is always a good idea before committing to significant changes.
  • Code must be mergeable, and all conflicts must be solved before merging.
  • Once the review process is done, unit tests pass, and conflicts are fixed, you still need to make the integration tests check pass. To trigger it, you need to lock the conversation in the pull request. The integration tests will then run, and a new check will appear with an "In progress" status. After some time, if everything goes well, the status check will become green, and your PR is ready to merge.

Branch naming conventions

In order to create a PR, you must create a branch from main. You should follow the GitFlow naming conventions, as detailed below:

  • feature/* - PR that implements a new feature
  • fix/* - PR that fixes a bug
  • doc/* - PR that enhances the documentation

On the right side of the branch name you can include the GitHub issue number. An example of doing this could be:

git checkout -b feature/XXX_add-an-awesome-new-feature

(where XXX is the issue number)

Commit message guidelines

The merge commit message should be structured following the conventional commits standard:

<commit type>([optional scope]): <description>

As an example:

fix(cli): Correct minor typos in code

The most important kind of commits are the ones that trigger version bumps and therefore a new release in the CI/CD system:

  • fix - patch version bump (0.0.x)
  • feat - minor version bump (0.x.0)
  • Any commit type followed by !, e.g. feat! - major version bump (x.0.0)
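
For example, these hypothetical merge commit messages would trigger, in order, a patch, a minor, and a major release:

fix(core): correct a race condition in the event store
feat(cli): add a new option to the project generator
feat!: change the public command handler signature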

Apart from those previously mentioned, there are more commit types:

  • build: Changes that affect the build system or external dependencies (example scopes: rush, tsconfig, npm)
  • ci: Changes to our CI configuration files and scripts
  • docs: Documentation only changes
  • feat: A new feature
  • fix: A bug fix
  • perf: A code change that improves performance
  • refactor: A code change that neither fixes a bug nor adds a feature
  • style: Changes that do not affect the meaning of the code (white-space, formatting, missing semi-colons, etc)
  • test: Adding missing tests or correcting existing tests

We're using the following scopes in the project:

  • cli
  • core
  • types
  • integration
  • aws
  • local

Apart from using conventional commits to trigger releases, we also use them to build the project changelog.

Code Style Guidelines

The Booster project comes with a nice set of ESLint config files to help you follow a consistent style, and we really encourage you to use them in your editor. You can also run the rush run lint:fix command to try to solve any linter problems automatically.

For everything else, the rule of thumb is: Try to be consistent with the code around yours, and if you're not sure, ask :-)

There are some things that the linter doesn't enforce but that are preferred this way:

Importing other files and libraries

Use import instead of require and import the objects individually when possible:

import { SomeClass, someFunction } from 'some-package'

Functional style

We give priority to a functional style of programming, but always taking into account how the objects are used to make sure they form a nice DSL. Classes are allowed when there's an actual state to hold, and we usually avoid default exports:

// module-a.ts, a conventional functional module
export function functionA() {
...
}

export const someConstantA = 42
// module-b.ts, grouping functions with a scope
export const ModuleB = {
functionB1: () => {...},
functionB2: () => {...},
}
// object-c.ts, a class
export class ObjectC {
constructor(readonly value: number) {}
}
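// main.ts, using the modules above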
import { functionA, someConstantA } from 'module-a'
import { ModuleB } from 'module-b'
import { ObjectC } from 'object-c'

functionA()
ModuleB.functionB1()
const obj = new ObjectC(someConstantA)

Use const and let

Default to const and immutable objects when possible; otherwise, use let.

// Good
let a = 0
const b = 3
a = a + b

// Less Good
var c = 0
let d = 3 // Never updated
+ \ No newline at end of file diff --git a/features/error-handling/index.html b/features/error-handling/index.html index 8c160b1a3..359dbbb3c 100644 --- a/features/error-handling/index.html +++ b/features/error-handling/index.html @@ -6,13 +6,13 @@ Error handling | Booster Framework - +
-

Error handling

Error handling in Booster

Booster provides a default error handling mechanism that will try to catch all the errors that are thrown in the application and will log them. This is useful for debugging purposes, but you may want to customize the error handling in your application. For example, you may want to send an email to the administrator when an error occurs.

Custom error handling

To customize the error handling, you need to create a class decorated with the @GlobalErrorHandler decorator. This class will contain the methods that will be called when an error is thrown. There is one method for each component in the application where an error can be thrown. All these functions receive the error that was thrown and the information about the component that was being executed when the error occurred.

They must return a promise that resolves to an Error object or undefined. If the promise resolves to undefined, the error will be ignored and the application will continue working. If the promise resolves to an Error object, the error will be thrown.
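
For instance, a minimal sketch of this contract using the catch-all onError method described below (the ValidationError name is just an illustrative assumption) could ignore one kind of error and re-throw everything else:

@GlobalErrorHandler()
export class IgnoreValidationErrors {
  public static async onError(error: Error | undefined): Promise<Error | undefined> {
    if (error?.name === 'ValidationError') {
      return undefined // resolving to undefined makes Booster ignore the error
    }
    return error // resolving to an Error makes Booster throw it
  }
}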

Command handle errors

These are the errors that are thrown in the handle method of the @Command. You can catch and return new errors in this method annotating a class with @GlobalErrorHandler and implementing the following method:

@GlobalErrorHandler()
export class MyErrorHandler {
public onCommandHandlerError(error: Error, command: CommandEnvelope): Promise<Error | undefined> {
// Do something with the error
}
}

This method receives the error that was thrown and the command that was being handled when the error occurred.

Scheduled command handle errors

These are the errors that are thrown in the handle method of the @ScheduledCommand. You can catch and return new errors in this function annotating a class with @GlobalErrorHandler and implementing the following method:

@GlobalErrorHandler()
export class MyErrorHandler {
public onScheduledCommandHandlerError(error: Error): Promise<Error | undefined> {
// Do something with the error
}
}

This method receives the error that was thrown.

Event handler errors

These are the errors that are thrown in the handle method of the @Event. You can catch and return new errors in this function annotating a class with @GlobalErrorHandler and implementing the following method:

@GlobalErrorHandler()
export class MyErrorHandler {
public onEventHandlerError(error: Error, event: EventEnvelope): Promise<Error | undefined> {
// Do something with the error
}
}

This method receives the error that was thrown and the event that was being handled when the error occurred.

Reducer errors

These are the errors that are thrown in the @Reduces method of the @Entity. You can catch and return new errors in this function annotating a class with @GlobalErrorHandler and implementing the following method:

@GlobalErrorHandler()
export class MyErrorHandler {
public onReducerError(error: Error, entity: EntityInterface, snapshot: EntityInterface | null): Promise<Error | undefined> {
// Do something with the error
}
}

This method receives the error that was thrown, the entity instance being reduced, and its current snapshot (or null if there is none), as shown in the signature above.

Projection errors

These are the errors that are thrown in the @Projects method of the @ReadModel. You can catch and return new errors in this function annotating a class with @GlobalErrorHandler and implementing the following method:

@GlobalErrorHandler()
export class MyErrorHandler {
public onProjectionError(error: Error, readModel: ReadModelInterface, entity: EntityInterface): Promise<Error | undefined> {
// Do something with the error
}
}

This method receives the error that was thrown, the read model being projected, and the entity that triggered the projection, as shown in the signature above.

All errors

These are the errors that are thrown in any of the previous methods. You can catch and return new errors in this function annotating a class with @GlobalErrorHandler and implementing the following method:

@GlobalErrorHandler()
export class MyErrorHandler {
public onError(error: Error | undefined): Promise<Error | undefined> {
// Do something with the error
}
}

This method receives the error that was thrown.

Global error handler example

You can implement all error handling functions in the same class. Here is an example of a global error handler that will handle all the errors mentioned above:

@GlobalErrorHandler()
export class AppErrorHandler {
public static async onCommandHandlerError(error: Error, command: CommandEnvelope): Promise<Error | undefined> {
return error
}

public static async onScheduledCommandHandlerError(error: Error): Promise<Error | undefined> {
return error
}

public static async onDispatchEventHandlerError(error: Error, eventInstance: EventInterface): Promise<Error | undefined> {
return error
}

public static async onReducerError(
error: Error,
eventInstance: EventInterface,
snapshotInstance: EntityInterface | null
): Promise<Error | undefined> {
return error
}

public static async onProjectionError(
error: Error,
entity: EntityInterface,
readModel: ReadModelInterface | undefined
): Promise<Error | undefined> {
return error
}

public static async onError(error: Error | undefined): Promise<Error | undefined> {
return error
}
}
- +

+ \ No newline at end of file diff --git a/features/event-stream/index.html b/features/event-stream/index.html index fcc33bf42..06cc9871a 100644 --- a/features/event-stream/index.html +++ b/features/event-stream/index.html @@ -6,13 +6,13 @@ The event stream | Booster Framework - +
-

The event stream

The event stream API is a read-only API that allows you to fetch the events that have been generated by your application. It's useful for debugging purposes, but also for building your own analytics tools. Access to this API is disabled by default, but you can enable it by configuring the authorizeReadEvents policy in your entities. You can also use the authorizeReadEvents policy to restrict access to the events of certain entities.

Accessing the event streams API

The authorizeReadEvents policy can be configured with any of the supported authorization mechanisms:

  • 'all' to make them public.
  • An array of roles.
  • An authorizer function that matches the EventStreamAuthorizer type signature. For example:
@Entity({
authorizeReadEvents: 'all', // Anyone can read any Cart's event
})
export class Cart {
public constructor(
readonly id: UUID,
readonly cartItems: Array<CartItem>,
public shippingAddress?: Address,
public checks = 0
) {}
// <reducers...>
}
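
As another hedged sketch, access could instead be restricted to a role; the Order entity and the Admin role below are purely illustrative:

@Entity({
  authorizeReadEvents: [Admin], // Only users with the Admin role can read this entity's events
})
export class Order {
  public constructor(readonly id: UUID, readonly items: Array<OrderItem>) {}
  // <reducers...>
}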
note

Be careful when exposing event data, as this data is likely to hold internal system state. Pay special attention when authorizing public access with the 'all' option; it's always recommended to look for alternative solutions that limit access.

To read more about how to restrict the access to the event stream API, check out the authorization guide.

- +


+ \ No newline at end of file diff --git a/features/logging/index.html b/features/logging/index.html index 65b573e2a..798fe659d 100644 --- a/features/logging/index.html +++ b/features/logging/index.html @@ -6,13 +6,13 @@ Logging in Booster | Booster Framework - +
-

Logging in Booster

If no configuration is provided, Booster uses the default JavaScript logging capabilities. Depending on the log level, it will call different logging methods:

  • console.debug for Level.debug
  • console.info for Level.info
  • console.warn for Level.warn
  • console.error for Level.error

In this regard, there's no distinction from any other Node process, and you'll find the logs in your cloud provider's default log aggregator (e.g. CloudWatch if you use AWS).

Advanced logging

You may need some advanced logging capabilities, such as redirecting your logs to a log aggregator. Booster also supports overriding the default behavior by providing custom loggers. The only thing you need to do is to provide an object that implements the Logger interface at config time:

@boostercloud/framework-types/lib/logger.ts
interface Logger {
debug(message?: any, ...optionalParams: any[]): void
info(message?: any, ...optionalParams: any[]): void
warn(message?: any, ...optionalParams: any[]): void
error(message?: any, ...optionalParams: any[]): void
}
src/config/config.ts
Booster.configure('development', (config: BoosterConfig): void => {
config.appName = 'my-store'
config.providerPackage = '@boostercloud/framework-provider-aws'
config.logger = new MyCustomLogger() // Overrides the default logger object
config.logLevel = Level.debug // Sets the log level at 'debug'
config.logPrefix = 'my-store-dev' // Sets the default prefix
})
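
As a sketch of what the MyCustomLogger object used above could look like (assuming the Logger interface is exported from @boostercloud/framework-types), this one simply tags every message and forwards it to the console; a real implementation could ship logs to any external aggregator instead:

import { Logger } from '@boostercloud/framework-types'

class MyCustomLogger implements Logger {
  debug(message?: any, ...optionalParams: any[]): void {
    console.debug('[custom]', message, ...optionalParams)
  }
  info(message?: any, ...optionalParams: any[]): void {
    console.info('[custom]', message, ...optionalParams)
  }
  warn(message?: any, ...optionalParams: any[]): void {
    console.warn('[custom]', message, ...optionalParams)
  }
  error(message?: any, ...optionalParams: any[]): void {
    console.error('[custom]', message, ...optionalParams)
  }
}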

Using the Booster's logger

All the framework's components will use this logger by default and will generate logs that match the following pattern:

[<logPrefix>]|moduleName: <message>

You can get a custom logger instance that extends the configured logger by adding your moduleName and optionally overriding the configured prefix with the getLogger helper function. It's a good practice to build and use a separate logger instance built with this method for each context, as this will make it easier to filter your logs when you need to investigate a problem.

Example: Obtaining a logger for your command:

@Command({
authorize: [User],
})
export class UpdateShippingAddress {
public constructor(readonly cartId: UUID, readonly address: Address) {}

public static async handle(command: UpdateShippingAddress, register: Register): Promise<void> {
const logger = getLogger(Booster.config, 'UpdateShippingCommand#handler', 'MyApp')
const logger = getLogger(Booster.config, 'UpdateShippingCommand#handler', 'MyApp')
logger.debug(`User ${register.currentUser?.username} changed shipping address for cart ${command.cartId}: ${JSON.stringify(command.address)}`)
register.events(new ShippingAddressUpdated(command.cartId, command.address))
}
}

When an UpdateShippingAddress command is handled, it will log messages that look like the following:

[MyApp]|UpdateShippingCommand#handler: User buyer42 changed shipping address for cart 314: { street: '13th rue del percebe', number: 6, ... }
info

Using the configured Booster logger is not mandatory for your application, but it might be convenient to centralize your logs and this is a standard way to do it.

- +


+ \ No newline at end of file diff --git a/features/schedule-actions/index.html b/features/schedule-actions/index.html index 4075bdde4..b4cd5b601 100644 --- a/features/schedule-actions/index.html +++ b/features/schedule-actions/index.html @@ -6,13 +6,13 @@ Schedule actions | Booster Framework - +
-

Schedule actions

There are many cases in which you want to trigger some action periodically. For example, you may want to send a reminder email to a user every day at 10:00 AM. For this, you can use scheduled commands.

Scheduled command

Commands represent an action. Therefore, the way to trigger an action periodically is by scheduling a command. Scheduled commands are the way to add automated tasks to your application, like cron jobs in other frameworks. Booster scheduled commands are TypeScript classes decorated with @ScheduledCommand. Unlike conventional commands, the handle function doesn't have any parameters.

In Booster, a scheduled command looks like this:

@ScheduledCommand({
minute: '0/5', // runs every 5 minutes
})
export class CheckCartCount {
public static async handle(): Promise<void> {
/* YOUR CODE HERE */
}
}

You can pass the following parameters to the @ScheduledCommand decorator:

  • minute: A cron expression to specify the minute(s) in which the command will be triggered. For example, 0/5 means "every 5 minutes". You can also use a comma-separated list of values, like 1,5,10,15,20,25,30,35,40,45,50,55 to specify a list of minutes.
  • hour: A cron expression to specify the hour(s) in which the command will be triggered. For example, 0/2 means "every 2 hours". You can also use a comma-separated list of values, like 1,3,5,7,9,11,13,15,17,19,21,23 to specify a list of hours.
  • day: A cron expression to specify the day(s) in which the command will be triggered. For example, 1/2 means "every 2 days". You can also use a comma-separated list of values, like 1,3,5,7,9,11,13,15,17,19,21,23,25,27,29,31 to specify a list of days.
  • month: A cron expression to specify the month(s) in which the command will be triggered. For example, 1/2 means "every 2 months". You can also use a comma-separated list of values, like 1,3,5,7,9,11 to specify a list of months.
  • weekDay: A cron expression to specify the day(s) of the week in which the command will be triggered. For example, 1/2 means "every 2 days of the week". You can also use a comma-separated list of values, like 1,3,5 to specify a list of days of the week.
  • year: A cron expression to specify the year(s) in which the command will be triggered. For example, 2020/2 means "every 2 years". You can also use a comma-separated list of values, like 2020,2022,2024,2026,2028,2030 to specify a list of years.

By default, if no parameters are passed, the scheduled command will not be triggered.
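
For instance, the daily 10:00 AM reminder mentioned at the beginning of this section could be sketched like this (the class name and the reminder logic are illustrative, and the exact time zone depends on your provider's cron configuration):

@ScheduledCommand({
  minute: '0',
  hour: '10', // runs every day at 10:00
})
export class SendDailyReminder {
  public static async handle(): Promise<void> {
    // Send the reminder email here
  }
}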

Creating a scheduled command

The preferred way to create a scheduled command is by using the generator, e.g.

boost new:scheduled-command CheckCartCount
- +

+ \ No newline at end of file diff --git a/frequently-asked-questions/index.html b/frequently-asked-questions/index.html index 3d68748c5..8244eb632 100644 --- a/frequently-asked-questions/index.html +++ b/frequently-asked-questions/index.html @@ -6,13 +6,13 @@ Frequently Asked Questions | Booster Framework - +
-

Frequently Asked Questions

1.- When deploying my application in AWS for the first time, I got an error saying "StagingBucket <your app name>-toolkit-bucket already exists"

When you deploy a Booster application to AWS, an S3 bucket needs to be created to upload the application code. Booster names that bucket using your application name as a prefix. In AWS, bucket names must be unique globally, so if there is another bucket in the world with exactly the same name as the one generated for your application, you will get this error.

The solution is to change your application name in the configuration file so that the bucket name is unique.

2.- I tried following the video guide but the function Booster.fetchEntitySnapshot is not found in BoostApp.

The function Booster.fetchEntitySnapshot was renamed to Booster.entity, so please replace it when following old tutorials.
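
As a hedged sketch of the renamed call (Cart and cartId are illustrative; the entity class and the ID come from your own application):

const cartSnapshot = await Booster.entity(Cart, cartId)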

- +


+ \ No newline at end of file diff --git a/getting-started/coding/index.html b/getting-started/coding/index.html index 86bc31641..267e864c3 100644 --- a/getting-started/coding/index.html +++ b/getting-started/coding/index.html @@ -6,7 +6,7 @@ Build a Booster app in minutes | Booster Framework - + @@ -62,8 +62,8 @@ we run new:project CLI command.

> boost nuke -e production

? Please, enter the app name to confirm deletion of all resources: boosted-blog

Congratulations! You've built a serverless backend in less than 10 minutes. We hope you have enjoyed discovering the magic of the Booster Framework.

9. More functionalities

This is a really basic example of a Booster application. There are many other features Booster provides, like:

  • Use a more complex authorization schema for commands and read models based on user roles
  • Use GraphQL subscriptions to get updates in real-time
  • Make events trigger other events
  • Deploy static content
  • Reading entities within command handlers to apply domain-driven decisions
  • And much more...

Continue reading to dig more. You've just scratched the surface of all the Booster capabilities!

Examples and walkthroughs

Creation of a question-asking application backend

In the following video, you will find out how to create a backend for a question-asking application from scratch. This application allows users to create questions and like them. The video goes from creating the project to incrementally deploying features in the application. You can find the code for both the frontend and the backend in this GitHub repo.

All the guides and examples

Check out the example apps repository to see Booster in use.


+ \ No newline at end of file diff --git a/getting-started/installation/index.html b/getting-started/installation/index.html index 513a43ecd..8ac4783bf 100644 --- a/getting-started/installation/index.html +++ b/getting-started/installation/index.html @@ -6,7 +6,7 @@ Installation | Booster Framework - + @@ -23,8 +23,8 @@ the stable versions are published to npm, these versions are the recommended ones, as they are well documented, and the changes are stated in the release notes.

To install the Booster CLI run this:

npm install --global @boostercloud/cli

Verify the Booster CLI installation with the boost version command. You should get back something like:

boost version

@boostercloud/cli/0.16.1 darwin-x64 node-v14.14.0


+ \ No newline at end of file diff --git a/going-deeper/custom-providers/index.html b/going-deeper/custom-providers/index.html index 853fb55a0..f95b35f26 100644 --- a/going-deeper/custom-providers/index.html +++ b/going-deeper/custom-providers/index.html @@ -6,7 +6,7 @@ Create custom providers | Booster Framework - + @@ -18,8 +18,8 @@ As it has been commented, this package includes all the necessary to deploy and configure cloud elements for running your application. For instance in the case of AWS, this package is in charge of deploy the DynamoDB for your event store, create all the lambdas, and configure all the API gateway configuration for your application.

The infrastructure package interface is composed of four methods:

    export interface ProviderInfrastructure {
deploy?: (config: BoosterConfig) => Promise<void>
nuke?: (config: BoosterConfig) => Promise<void>
start?: (config: BoosterConfig, port: number) => Promise<void>
synth?: (config: BoosterConfig) => Promise<void>
}
  • deploy: This method is called by the CLI during deployment and it should be in charge of deploying all the necessary resources for your application and rockets.
  • nuke: This method is in charge of destroying all the resources generated during the deploy, and it is called during the nuke process.
  • start: This method is used when the provider implements a server that needs to be started (e.g. the local provider).
  • synth: This method allows you to export the infrastructure to a file (for instance, if you use the Terraform CDK, you can export the script here to run it using conventional Terraform tools).

The infrastructure interface just defines an adapter for Booster so the framework knows how to start any of the described processes, but you can use any Infrastructure as Code tool that has a TypeScript DSL (CDK), or even call other CLI tools or scripts if you'd rather maintain it using a different technology.
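
For instance, a hedged skeleton of an infrastructure adapter implementing only deploy and nuke could look like the following (assuming both types are exported from @boostercloud/framework-types; the comments stand in for whatever IaC tooling you choose):

import { BoosterConfig, ProviderInfrastructure } from '@boostercloud/framework-types'

export const Infrastructure: ProviderInfrastructure = {
  deploy: async (config: BoosterConfig): Promise<void> => {
    // Create the event store, the API and any other resources your provider needs here
  },
  nuke: async (config: BoosterConfig): Promise<void> => {
    // Destroy everything that deploy created for this application
  },
}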

The runtime interface in detail

The other key aspect of implementing a provider is the runtime package. This package is in charge of the interaction between the Booster framework and all deployed resources while the application is running. For instance, this package is responsible for storing data in the event store, performing the data projections, etc.

The runtime interface (ProviderLibrary) is divided into seven sections:

Events

This section is in charge of all operations related to events. The methods of this section are the following:

export interface ProviderEventsLibrary {
rawToEnvelopes(rawEvents: unknown): Array<EventEnvelope>
forEntitySince(config: BoosterConfig, entityTypeName: string, entityID: UUID, since?: string): Promise<Array<EventEnvelope>>
latestEntitySnapshot(config: BoosterConfig, entityTypeName: string, entityID: UUID): Promise<EventEnvelope | null>
search(config: BoosterConfig, parameters: EventSearchParameters): Promise<Array<EventSearchResponse>>
store(eventEnvelopes: Array<EventEnvelope>, config: BoosterConfig): Promise<void>
searchEntitiesIDs(config: BoosterConfig, limit: number, afterCursor: Record<string, string> | undefined, entityTypeName: string): Promise<PaginatedEntitiesIdsResult>
}
  • rawToEnvelopes: Inside the framework, all user application data is processed encapsulated in envelope objects. This particular function transforms raw data coming from the provider's database into Booster framework envelope objects.
  • forEntitySince: This method has to return all the events associated with a specific entity.
  • latestEntitySnapshot: With this method the framework should be able to obtain the latest snapshot for a specific entity.
  • search: This method receives a query, and it should run it against the database used by the provider and return the result.
  • store: This method is used to store new events in the database.
  • searchEntitiesIDs: This method is used to implement pagination in searches.

Read Models

This section of the interface gives the framework the ability to interact with the database to manage read models through the following methods:

export interface ProviderReadModelsLibrary {
rawToEnvelopes(config: BoosterConfig, rawEvents: unknown): Promise<Array<ReadModelEnvelope>>
fetch(config: BoosterConfig, readModelName: string, readModelID: UUID, sequenceKey?: SequenceKey): Promise<ReadOnlyNonEmptyArray<ReadModelInterface>>
search<TReadModel extends ReadModelInterface>(config: BoosterConfig, entityTypeName: string, filters: FilterFor<unknown>, sortBy?: SortFor<unknown>, limit?: number, afterCursor?: unknown, paginatedVersion?: boolean): Promise<Array<TReadModel> | ReadModelListResult<TReadModel>>
store(config: BoosterConfig, readModelName: string, readModel: ReadModelInterface, expectedCurrentVersion?: number): Promise<unknown>
delete(config: BoosterConfig, readModelName: string, readModel: ReadModelInterface | undefined): Promise<any>
subscribe(config: BoosterConfig, subscriptionEnvelope: SubscriptionEnvelope): Promise<void>
fetchSubscriptions(config: BoosterConfig, subscriptionName: string): Promise<Array<SubscriptionEnvelope>>
deleteSubscription(config: BoosterConfig, connectionID: string, subscriptionID: string): Promise<void>
deleteAllSubscriptions(config: BoosterConfig, connectionID: string): Promise<void>
}
  • rawToEnvelopes: This method is used to transform raw database data into read model envelopes.
  • fetch: Fetch a specific read model from the database.
  • search: This method receives a search query and should return the read model search result.
  • store: Save a new read model projection to the database (see the sketch after this list).
  • delete: Delete a read model from the database.
  • subscribe: This method is used to subscribe a client to a specific read model.
  • fetchSubscriptions: Get the list of all clients subscribed to a specific read model.
  • deleteSubscription: Delete a specific read model subscription.
  • deleteAllSubscriptions: Delete all the subscriptions for a specific connection.
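
To make the contract more concrete, here is an intentionally naive, in-memory sketch of store and fetch. It is only an illustration of the expected inputs and outputs, not a real provider: the table map is hypothetical, the fetch return type is simplified to a plain array, and a real implementation would use expectedCurrentVersion for an optimistic concurrency check before overwriting.

import { BoosterConfig, ReadModelInterface, UUID } from '@boostercloud/framework-types'

// Hypothetical in-memory table, keyed by "<readModelName>:<id>" (illustration only)
const table = new Map<string, ReadModelInterface>()

export async function store(
  config: BoosterConfig,
  readModelName: string,
  readModel: ReadModelInterface,
  expectedCurrentVersion?: number
): Promise<unknown> {
  // A real provider would compare `expectedCurrentVersion` with the stored copy here
  table.set(`${readModelName}:${readModel.id}`, readModel)
  return readModel
}

export async function fetch(
  config: BoosterConfig,
  readModelName: string,
  readModelID: UUID
): Promise<Array<ReadModelInterface>> {
  const found = table.get(`${readModelName}:${readModelID}`)
  return found ? [found] : []
}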

GraphQL

This section of the API provides everything necessary to receive GraphQL queries from the client side and to build the responses for those requests:

export interface ProviderGraphQLLibrary {
rawToEnvelope(config: BoosterConfig, rawGraphQLRequest: unknown): Promise<GraphQLRequestEnvelope | GraphQLRequestEnvelopeError>
handleResult(result?: unknown, headers?: Record<string, string>): Promise<unknown>
}
  • rawToEnvelope: This method receives the request from the client containing the GraphQL query, and it should return the envelope object for that query.
  • handleResult: This method receives the GraphQL results and it should return the response object for the client.

API responses

General API response management:

export interface ProviderAPIHandling {
requestSucceeded(body?: unknown, headers?: Record<string, number | string | ReadonlyArray<string>>): Promise<unknown>
requestFailed(error: Error): Promise<unknown>
}
  • requestSucceeded: This is a general method for processing success responses (see the sketch after this list).
  • requestFailed: This is a general method for processing error responses.
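
For an HTTP-style provider, these two methods often just wrap the payload into whatever response shape the API gateway integration expects. The shape used below (a statusCode plus a JSON body) is an assumption made for illustration, not the contract of any specific provider.

export async function requestSucceeded(
  body?: unknown,
  headers?: Record<string, number | string | ReadonlyArray<string>>
): Promise<unknown> {
  return {
    headers,
    statusCode: 200,
    body: body !== undefined ? JSON.stringify(body) : '',
  }
}

export async function requestFailed(error: Error): Promise<unknown> {
  return {
    statusCode: 500,
    body: JSON.stringify({ title: 'Error', reason: error.message }),
  }
}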

Connections

This section of the API is in charge of managing connections for subscriptions at the API gateway level:

export interface ProviderConnectionsLibrary {
storeData(config: BoosterConfig, connectionID: string, data: ConnectionDataEnvelope): Promise<void>
fetchData(config: BoosterConfig, connectionID: string): Promise<ConnectionDataEnvelope | undefined>
deleteData(config: BoosterConfig, connectionID: string): Promise<void>
sendMessage(config: BoosterConfig, connectionID: string, data: unknown): Promise<void>
}
  • storeData: This method receives all the information about the incoming connection and it should store that data in a database.
  • fetchData: Fetch the information about a specific client connection from the database.
  • deleteData: Delete all the information about a specific client connection.
  • sendMessage: Send a message to a specific client. This method gets the message and destination as parameters, and it should be able to fetch the connection information from the database and send the provided data to the client (see the sketch after this list).
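
A heavily simplified sendMessage could look like this. pushToConnection is a hypothetical placeholder for whatever "send a payload to an open WebSocket connection" API your cloud provider offers; it is not part of Booster.

import { BoosterConfig } from '@boostercloud/framework-types'

// Hypothetical provider-specific helper that pushes a payload to an open WebSocket connection
declare function pushToConnection(connectionID: string, payload: string): Promise<void>

export async function sendMessage(config: BoosterConfig, connectionID: string, data: unknown): Promise<void> {
  await pushToConnection(connectionID, JSON.stringify(data))
}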

Scheduled

Finally, this section of the API is related to scheduled commands:

export interface ScheduledCommandsLibrary {
rawToEnvelope(config: BoosterConfig, rawMessage: unknown): Promise<ScheduledCommandEnvelope>
}
  • rawToEnvelope: As in other sections, this method is in charge of transforming the raw scheduled command message into a framework envelope.

Tips for developing custom providers

  • As a starting point, check the implementation of the other providers to see how everything is implemented.
  • Start the provider implementation with the infrastructure package: once you have all the infrastructure deployed, working with the runtime API will be easier.
  • If you need support during development, remember that you have access to our Discord, where some community members will be able to help you.
+ \ No newline at end of file diff --git a/going-deeper/custom-templates/index.html b/going-deeper/custom-templates/index.html index 51ee19987..8df042287 100644 --- a/going-deeper/custom-templates/index.html +++ b/going-deeper/custom-templates/index.html @@ -6,15 +6,15 @@ Customizing CLI resource templates | Booster Framework - +

Customizing CLI resource templates

You can change what the newly created Booster resources will contain by customizing the resource template files.

To do this, you first need to publish the resource templates by running the boost stub:publish command. This will create a folder stubs in the root directory of the project, and it will contain all the resources that you can customize:

stubs/
├─ command.stub
├─ entity.stub
├─ event.stub
├─ event-handler.stub
├─ read-model.stub
├─ scheduled-command.stub
└─ type.stub

After that, Booster CLI will start using your local templates instead of the default ones. Let's try this by adding a simple comment to the type.stub file.

// Look I am a comment that will now appear in every new type file 🐙
export class {{{ name }}} {
public constructor(
{{#fields}}
public {{{name}}}: {{{type}}},
{{/fields}}
) {}
}

Now, if you run the boost new:type CartItem --fields sku:string command, you will get a common/cart-item.ts file with the following content:

// Look I am a comment that will now appear in every new type file 🐙
export class CartItem {
public constructor(
public sku: string,
) {}
}

You did it, we just updated our resource template file! Now when you run `boost new:type`, the generated file will contain the comment you added earlier 🚀 Of course, this is a simple example, and you may want to add new methods, import something, you name it!
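
As an illustration, a customized command.stub could add imports and a default handler to every new command. This is only a sketch: it assumes the same {{{ name }}} and {{#fields}} variables shown in type.stub above, and the exact set of variables available in command.stub may differ.

// Look, I am a comment that will now appear in every new command file 🐙
import { Command } from '@boostercloud/framework-core'
import { Register } from '@boostercloud/framework-types'

@Command({ authorize: 'all' })
export class {{{ name }}} {
  public constructor(
    {{#fields}}
    readonly {{{name}}}: {{{type}}},
    {{/fields}}
  ) {}

  public static async handle(command: {{{ name }}}, register: Register): Promise<void> {
    // Validate inputs, run domain logic and register events here
  }
}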

Here are some answers to questions you may have:

QA

Can I have only one stub for a certain resource?
Yes! The resource generator will check whether you have a custom template for a resource; otherwise, it will use the default template.
How can I keep up with new template updates?
1. Run `boost stub:publish --force` command
2. Review changes
3. Done!
Can I adjust the command template and leave the other resources as they are?
Yes. You can only have the `command.stub` file in the `/stubs` folder and customize it.
The generator will use the default templates for the other resources.
How can I use the default templates again!?
Simply delete the `/stubs` folder or a specific resource file.
What are these strange name, #fields, etc. things?
These are the variables and sections used by the mustache.js templating engine.
They allow us to dynamically generate new resources.
How do I change what `new:project` command generates?
At the moment there is no way to do this.
But in the future we will move the new project template out of the CLI package (https://github.com/boostercloud/booster/issues/1078), and then you will be able to create and use your own templates for new projects.
I have another question!
You can ask questions on our Discord channel or create a discussion on GitHub.
+ \ No newline at end of file diff --git a/going-deeper/data-migrations/index.html b/going-deeper/data-migrations/index.html index 7bc5c6fd2..9bf79efbf 100644 --- a/going-deeper/data-migrations/index.html +++ b/going-deeper/data-migrations/index.html @@ -6,13 +6,13 @@ Migrations | Booster Framework - +
-

Migrations

Migrations are a mechanism for updating or transforming the schemas of events and entities as your system evolves. This allows you to make changes to your data model without losing or corrupting existing data. There are two types of migration tools available in Booster: schema migrations and data migrations.

  • Schema migrations are used to incrementally upgrade an event or entity from a past version to the next. They are applied lazily, meaning that they are performed on-the-fly whenever an event or entity is loaded. This allows you to make changes to your data model without having to manually update all existing artifacts, and makes it possible to apply changes without running lengthy migration processes.

  • Data migrations, on the other hand, behave as background processes that can actively change the existing values in the database for existing entities and read models. They are particularly useful for data migrations that cannot be performed automatically with schema migrations, or for updating existing read models after a schema change.

Together, schema and data migrations provide a flexible and powerful toolset for managing the evolution of your data model over time.

Schema migrations

Booster handles classes annotated with @Migrates as schema migrations. The migration functions defined inside will update an existing artifact (either an event or an entity) from a previous version to a newer one whenever that artifact is visited. Schema migrations are applied to events and entities lazily, meaning that they are only applied when the event or entity is loaded. This ensures that the migration process is non-disruptive and does not affect the performance of your system. Schema migrations are also performed on-the-fly and the results are not written back to the database, as events are not revisited once the next snapshot is written in the database.

For example, to upgrade a Product entity from version 1 to version 2, you can write the following migration class:

@Migrates(Product)
export class ProductMigration {
@ToVersion(2, { fromSchema: ProductV1, toSchema: ProductV2 })
public async changeNameFieldToDisplayName(old: ProductV1): Promise<ProductV2> {
return new ProductV2(
old.id,
old.sku,
old.name,
old.description,
old.price,
old.pictures,
old.deleted
)
}
}

Notice that we've used the @ToVersion decorator in the above example. This decorator not only tells Booster what schema upgrade this migration performs, it also informs it about the existence of a version, which is always an integer number. Booster will always use the latest version known to tag newly created artifacts, defaulting to 1 when no migrations are defined. This ensures that the schema of newly created events and entities is up-to-date and that they can be migrated as needed in the future.

The @ToVersion decorator takes two parameters in addition to the version: fromSchema and toSchema. The fromSchema parameter is set to ProductV1, while the toSchema parameter is set to ProductV2. This tells Booster that the migration is updating the Product object from version 1 (as defined by the ProductV1 schema) to version 2 (as defined by the ProductV2 schema).

As Booster can easily read the structure of your classes, the schemas are described as plain classes that you can maintain as part of your code. The ProductV1 class represents the schema of the previous version of the Product object with the properties and structure of the Product object as it was defined in version 1. The ProductV2 class is an alias for the latest version of the Product object. You can use the Product class here, there's no difference, but it's a good practice to create an alias for clarity.

It's a good practice to define the schema classes (ProductV1 and ProductV2) as non-exported classes in the same migration file. This allows you to see the changes made between versions and helps to understand how the migration works:

class ProductV1 {
public constructor(
public id: UUID,
readonly sku: string,
readonly name: string,
readonly description: string,
readonly price: Money,
readonly pictures: Array<Picture>,
public deleted: boolean = false
) {}
}

class ProductV2 extends Product {}

When you want to upgrade your artifacts from V2 to V3, you can add a new function decorated with @ToVersion to the same migrations class. You're free to structure the code the way you want, but we recommend keeping all migrations for the same artifact in the same migration class. For instance:

@Migrates(Product)
export class ProductMigration {
@ToVersion(2, { fromSchema: ProductV1, toSchema: ProductV2 })
public async changeNameFieldToDisplayName(old: ProductV1): Promise<ProductV2> {
return new ProductV2(
old.id,
old.sku,
old.name, // It's now called `displayName`
old.description,
old.price,
old.pictures,
old.deleted
)
}

@ToVersion(3, { fromSchema: ProductV2, toSchema: ProductV3 })
public async addNewField(old: ProductV2): Promise<ProductV3> {
return new ProductV3(
old.id,
old.sku,
old.displayName,
old.description,
old.price,
old.pictures,
old.deleted,
42 // We set a default value to initialize this field
)
}
}

In this example, the changeNameFieldToDisplayName function updates the Product entity from version 1 to version 2 by renaming the name field to displayName. Then, the addNewField function updates the Product entity from version 2 to version 3 by adding a new field called newField to the entity's schema. Notice that at this point, your database could have snapshots stored as v1, v2, or v3, so while it might be tempting to redefine the original migration and keep a single 1-to-3 migration, it's usually a good idea to keep the intermediate steps. This way Booster will be able to handle any scenario.
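
For completeness, once version 3 exists, the schema classes could be laid out as sketched below. This is hypothetical: ProductV2 now needs to be spelled out (it is no longer an alias of the latest Product), ProductV3 becomes the alias, and the newField name simply mirrors the description above.

// The v2 shape, written out explicitly now that it's no longer the latest version
class ProductV2 {
  public constructor(
    public id: UUID,
    readonly sku: string,
    readonly displayName: string,
    readonly description: string,
    readonly price: Money,
    readonly pictures: Array<Picture>,
    public deleted: boolean = false
  ) {}
}

// Alias for the latest version of the Product entity (which now includes newField)
class ProductV3 extends Product {}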

Data migrations

Data migrations can be seen as background processes that can actively update the values of existing entities and read models in the database. They can be useful to perform data migrations that cannot be handled with schema migrations, for example when you need to update the values exposed by the GraphQL API, or to initialize new read models that are projections of previously existing entities.

To create a data migration in Booster, you can use the @DataMigration decorator on a class that implements a start method. The @DataMigration decorator takes an object with a single parameter, order, which specifies the order in which the data migration should be run relative to other data migrations.

Data migrations are not run automatically; you need to invoke the BoosterDataMigrations.run() method from an event handler or a command. This will emit a BoosterDataMigrationStarted event, which will make Booster check for any pending migrations and run them in the specified order. A common pattern to be able to run migrations on demand is to add a special command, with access limited to an administrator role, that calls this function.
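
A sketch of that pattern could look like the following. This is not the canonical implementation: the Admin role and the ../roles import path are assumptions about your own application, and BoosterDataMigrations is assumed to be importable from @boostercloud/framework-core as referenced above.

import { Command, BoosterDataMigrations } from '@boostercloud/framework-core'
import { Register } from '@boostercloud/framework-types'
import { Admin } from '../roles' // hypothetical role defined in your application

@Command({ authorize: [Admin] })
export class RunDataMigrations {
  public constructor() {}

  public static async handle(_command: RunDataMigrations, _register: Register): Promise<void> {
    // Emits BoosterDataMigrationStarted under the hood, which triggers any pending data migrations
    await BoosterDataMigrations.run()
  }
}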

Take into account that, depending on your cloud provider implementation, data migrations are executed in the context of a lambda or function app, so it's advisable to design these functions in a way that allows re-running them in case of failures (e.g. lambda timeouts). In order to tell Booster that your migration has been applied successfully, at the end of each DataMigration.start method you must manually emit a BoosterDataMigrationFinished event.

Inside your @DataMigration classes, you can use the BoosterDataMigrations.migrateEntity method to update the data for a specific entity. This method takes the old entity name, the old entity ID, and the new entity data as arguments. It will also generate an internal BoosterEntityMigrated event before performing the migration.

Note that Data migrations are only available in the Azure provider at the moment.

Here is an example of how you might use the @DataMigration decorator and the BoosterDataMigrations.migrateEntity method to update the quantity of the first item in a cart (notice that, at the time of writing this document, the Booster.entitiesIDs method used in the following example is only available in the Azure provider, so you may need to approach the migration differently in AWS):

@DataMigration({
order: 2,
})
export class CartIdDataMigrateV2 {
public constructor() {}


public static async start(register: Register): Promise<void> {
const entitiesIdsResult = await Booster.entitiesIDs('Cart', 500, undefined)
const paginatedEntityIdResults = entitiesIdsResult.items

const carts = await Promise.all(
paginatedEntityIdResults.map(async (entity) => await Booster.entity(Cart, entity.entityID))
)
await Promise.all(
carts.map(async (cart) => {
cart.cartItems[0].quantity = 100
const newCart = new Cart(cart.id, cart.cartItems, cart.shippingAddress, cart.checks)
await BoosterDataMigrations.migrateEntity('Cart', cart.id, newCart)
return cart.id
})
})
)

register.events(new BoosterDataMigrationFinished('CartIdDataMigrateV2'))
}
}

Migrate from Previous Booster Versions

  • To migrate to new versions of Booster, check that you have the latest development dependencies required:
"devDependencies": {
"rimraf": "^5.0.0",
"@typescript-eslint/eslint-plugin": "4.22.1",
"@typescript-eslint/parser": "4.22.1",
"eslint": "7.26.0",
"eslint-config-prettier": "8.3.0",
"eslint-plugin-prettier": "3.4.0",
"mocha": "10.2.0",
"@types/mocha": "10.0.1",
"nyc": "15.1.0",
"prettier": "2.3.0",
"typescript": "4.5.4",
"ts-node": "9.1.1",
"@types/node": "15.0.2",
"ttypescript": "1.5.15",
"@boostercloud/metadata-booster": "0.30.2"
},
+ \ No newline at end of file diff --git a/going-deeper/environment-configuration/index.html b/going-deeper/environment-configuration/index.html index 8b014ba22..ba973103d 100644 --- a/going-deeper/environment-configuration/index.html +++ b/going-deeper/environment-configuration/index.html @@ -6,13 +6,13 @@ Environments | Booster Framework - +
-

Environments

You can create multiple environments by calling the Booster.configure function several times, using different environment names as the first argument. You can create one file for each environment, but it is not required. In this example we set all environments in a single file:

// Here we use a single file called src/config.ts, but you can use separate files for each environment too.
import { Booster } from '@boostercloud/framework-core'
import { BoosterConfig } from '@boostercloud/framework-types'
// A provider that deploys your app to AWS:

Booster.configure('stage', (config: BoosterConfig): void => {
config.appName = 'fruit-store-stage'
config.providerPackage = '@boostercloud/framework-provider-aws'
})

Booster.configure('prod', (config: BoosterConfig): void => {
config.appName = 'fruit-store-prod'
config.providerPackage = '@boostercloud/framework-provider-aws'
})

It is also possible to place an environment configuration in a separate file. Let's say that a developer called "John" created his own configuration file src/config/john.ts. The content would be the following:

import { Booster } from '@boostercloud/framework-core'
import { BoosterConfig } from '@boostercloud/framework-types'

Booster.configure('john', (config: BoosterConfig): void => {
config.appName = 'john-fruit-store'
config.providerPackage = '@boostercloud/framework-provider-aws'
})

The environment name will be required by any command from the Booster CLI that depends on the provider. For instance, when you deploy your application, you'll need to specify on which environment you want to deploy it:

boost deploy -e prod

This way, you can have different configurations depending on your needs.
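
Other provider-dependent commands mentioned in this documentation take the same flag, so John could, for example, deploy and later tear down his private environment like this (illustrative):

boost deploy -e john
boost nuke -e john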

Booster environments are extremely flexible. As shown in the first example, your 'fruit-store' app can have three team-wide environments: 'dev', 'stage', and 'prod', each of them with different app names or providers, that are deployed by your CI/CD processes. Developers, like "John" in the second example, can create their own private environments in separate config files to test their changes in realistic environments before committing them. Likewise, CI/CD processes could generate separate production-like environments to test different branches to perform QA in separate environments without interferences from other features under test.

The only thing you need to do to deploy a whole new, completely independent copy of your application is to use a different name. Also, Booster uses the credentials available on the machine that performs the deployment process (~/.aws/credentials in AWS), so developers can even work on separate accounts from the production or staging environments.

+ \ No newline at end of file diff --git a/going-deeper/framework-packages/index.html b/going-deeper/framework-packages/index.html index 61740c6cf..0b172b2ac 100644 --- a/going-deeper/framework-packages/index.html +++ b/going-deeper/framework-packages/index.html @@ -6,13 +6,13 @@ Framework packages | Booster Framework - +
-

Framework packages

The framework is already split into different packages:

Framework Core

The framework-core package includes the most important components of the framework abstraction. It can be seen as the skeleton or the main architecture of the framework.

The package defines the specification of how a Booster application should work, without taking into account the specific providers that could be used. Every Booster provider package is based on the components that the framework core needs in order to work on each platform.

Framework Types

The framework-types package includes the types that define the domain of the Booster framework. It defines domain concepts like an Event, a Command or a Role.

+ \ No newline at end of file diff --git a/going-deeper/infrastructure-providers/index.html b/going-deeper/infrastructure-providers/index.html index 8fe967296..0d3d944e9 100644 --- a/going-deeper/infrastructure-providers/index.html +++ b/going-deeper/infrastructure-providers/index.html @@ -6,7 +6,7 @@ Configuring Infrastructure Providers | Booster Framework - + @@ -20,8 +20,8 @@ if you're not planning to keep using it.

Creating an AWS account

If you don't have an AWS account, you can create one by following the instructions in the AWS documentation.

Getting the AWS credentials

Once you have an AWS account, you need to get the credentials that Booster needs to deploy your application. You can follow the instructions in the AWS documentation to get them.

Booster needs you to get the following credentials:

  • Access Key ID
  • Secret Access Key

Make sure you get them, as they will be needed in the next step.

Setting the AWS credentials on Booster

Booster needs to know how to authenticate against your AWS account. For that reason, create a folder called .aws under your home folder, and a file inside called credentials with this syntax:

~/.aws/credentials
[default]
aws_access_key_id=<YOUR ACCESS KEY ID>
aws_secret_access_key=<YOUR SECRET ACCESS KEY>

AWS Provider configuration

To configure AWS as a provider you need to meet certain prerequisites:

  • Check @boostercloud/framework-provider-aws is listed in your app package.json dependencies.
  • Check @boostercloud/framework-provider-aws-infrastructure is listed in your app package.json devDependencies.
  • Check both dependencies are installed, otherwise use npm install in the root of your project.

Now go to your config.ts file and set up your app environment to use the AWS provider package.

import { Booster } from '@boostercloud/framework-core'
import { BoosterConfig } from '@boostercloud/framework-types'

Booster.configure('production', (config: BoosterConfig): void => {
config.appName = 'my-app-name'
config.providerPackage = '@boostercloud/framework-provider-aws'
})

Open your terminal and run the deployment command

boost deploy -e production

Now just let the magic happen: Booster will create everything for you and give you back the URL of your ready-to-use app. 🚀

Azure Provider Setup

Booster applications can be deployed to Microsoft Azure. To do so, you need to have an Azure account and to have the Azure CLI installed on your computer.

caution

Booster is free to use, but remember that the resources deployed to your cloud provider might generate some expenses.

In Azure, when you're not eligible for the free tier, resources are charged on-demand. Deploying a Booster project and sending a few commands and queries should cost just a few cents.

In any case, make sure to un-deploy your application with the command boost nuke -e production if you're not planning to keep using it.

Creating an Azure account

As mentioned, you need to have an Azure account. If you don't have one, you can create one from the Microsoft SignUp page. You can also use your existing Microsoft account to create an Azure account.

Installing the Azure CLI

Once you have created the Azure account, you need to install the Azure CLI on your computer. You can do it by following the instructions on the official documentation. You may also need to install jq on your system.

Azure Provider configuration

To configure Azure as a provider you need to meet certain prerequisites:

  • Install jq in your system.
  • Install the terraform CLI.
  • Check @boostercloud/framework-provider-azure is listed in your app package.json dependencies.
  • Check @boostercloud/framework-provider-azure-infrastructure is listed in your app package.json devDependencies.
  • Check both dependencies are installed, otherwise use npm install in the root of your project.

At this point you have to log in to your Azure account using the Azure CLI with the following command.

az login

You will get something like this:

[
{
"cloudName": "AzureCloud",
"homeTenantId": "00000000-0000-0000-0000-000000000000",
"id": "00000000-0000-0000-0000-000000000000",
"isDefault": true,
"managedByTenants": [],
"name": "Azure subscription name",
"state": "Enabled",
"tenantId": "00000000-0000-0000-0000-000000000000",
"user": {
"name": "boosteduser@boosteddomain.com",
"type": "user"
}
}
]

Keep the id from the login output around.

Then create a service principal with the Contributor role on the chosen subscription by running the following command.

az ad sp create-for-rbac --name <service-principal-name> --role="Contributor" --scopes="/subscriptions/<the-id-from-the-login-output>"
note

Remember to change <service-principal-name> for a custom one.

After the service principal is created, create a bash script with the following content. It will set up the necessary environment variables required by the provider in order to work:

#!/usr/bin/env bash

SP_DISPLAY_NAME="<service-principal-name>" #replace <service-principal-name> with the name of your own SP
REGION="East US" #replace with a region of your choice, see full list here: https://azure.microsoft.com/en-us/global-infrastructure/locations/

export AZURE_APP_ID=$(az ad sp list --display-name ${SP_DISPLAY_NAME} | jq -r '.[].appId')
export AZURE_TENANT_ID=$(az ad sp list --display-name ${SP_DISPLAY_NAME} | jq -r '.[].appOwnerOrganizationId')
export AZURE_SECRET=$(az ad sp credential reset --id ${AZURE_APP_ID} | jq -r '.password')
export AZURE_SUBSCRIPTION_ID=$(az account show | jq -r '.id')
export REGION=$REGION
note

Remember to have jq installed in your system.

Now go to your config.ts file and set up your app environment to use the Azure provider package.

import { Booster } from '@boostercloud/framework-core'
import { BoosterConfig } from '@boostercloud/framework-types'

Booster.configure('production', (config: BoosterConfig): void => {
config.appName = 'my-app-name'
config.providerPackage = '@boostercloud/framework-provider-azure'
})

Open your terminal, run the bash file to export your environment variables, and then run the deploy command:

source <path-to-your-bash-file> && boost deploy -e production
note

Remember to have the terraform CLI installed in your system.

Now just let the magic happen: Booster will create everything for you and give you back the URL of your ready-to-use app. 🚀

Azure synth command

Azure provider implements the experimental Booster synth command. This command will generate Terraform templates from your code. It will also generate needed files to deploy your Booster application using cdktf.

Running the synth command, for example boost synth -e production, will generate the following files:

  • A file cdktf.json: A basic json file to deploy your application using cdktf
  • A folder cdktf.out: with the Terraform templates.

The Booster deploy command for Azure will deploy your application using the generated templates. You don't need to run the synth command to deploy your application; the deploy command will generate the templates for you before deploying.

Once the new files and folders have been generated, you can use cdktf to deploy your application if you want to.

Azure and CI/CD environments

It is possible to deploy your Azure infrastructure using the Terraform templates generated by the synth command and the Terraform executable.

To deploy a Booster application using the Terraform templates generated by the Booster synth command:

  1. Navigate to
> cd cdktf.out/stacks/<application name><environment name>
  2. Run (only the first time)
> terraform init
  3. Run
> terraform plan --out plan
  4. Run
> terraform apply "plan"

You could follow similar steps to integrate the Azure Booster deploys in your CI/CD environment.

  1. Navigate to
> cd cdktf.out/stacks/<application name><environment name>
  2. Copy functionApp.zip to the destination folder
> cp functionApp.zip <destination>

After copying the files you should have the following structure:

<application>
├── cdktf.out
│ └── stacks
│ └── <application name><environment name>
│ └── cdk.tf.json

Now deploy the template:

  1. Run (only the first time)
> terraform init
  2. Run
> terraform plan --out plan
  3. Run
> terraform apply "plan"

Finally, you need to upload the source code. The main options are (more info):

  1. Using az-cli. Run
> az functionapp deployment source config-zip -g <resource_group> -n \
<app_name> --src ./functionApp.zip
  2. Using REST APIs. Send a POST request to https://<app_name>.scm.azurewebsites.net/api/zipdeploy. Example:
>  curl -X POST -u <deployment_user> --data-binary @"<zip_file_path>" https://<app_name>.scm.azurewebsites.net/api/zipdeploy
note

Remember to follow the Azure Provider steps on this page to set up your credentials correctly.

Azure host.json file

Azure Provider will generate a default host.json file if there is not a host.json entry in the config.assets array.

If you want to use your own host.json file, just add it to the config.assets array and Booster will use yours.
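
For example, assuming your custom host.json sits at the root of your project, registering it as an asset could look like the sketch below (the app name and environment are placeholders):

import { Booster } from '@boostercloud/framework-core'
import { BoosterConfig } from '@boostercloud/framework-types'

Booster.configure('production', (config: BoosterConfig): void => {
  config.appName = 'my-app-name'
  config.providerPackage = '@boostercloud/framework-provider-azure'
  // Ship our own host.json instead of the default one generated by the Azure provider
  config.assets = ['host.json']
})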

Local Provider

All Booster projects come with a local development environment configured by default, so you can test your app before deploying it to the cloud.

You can see the configured local environment in your src/config/config.ts file:

Booster.configure('local', (config: BoosterConfig): void => {
config.appName = 'my-store'
config.providerPackage = '@boostercloud/framework-provider-local'
})

In order to start your application using the local provider, use the following command:

boost start -e local

Where local is one of your defined environments with the Booster.configure call.

Cleaning Local Data for New Changes

When making changes to your entities and events, you might need to reset the local environment to accommodate these changes. The application creates a .booster folder to store relevant information. To clean the local data and reset the environment:

  1. Locate the .booster folder in your project directory.
  2. Delete the contents of the .booster folder or the folder itself.

This action will clear the local data and allow you to proceed with your new changes effectively.
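
On a Unix-like shell, and assuming you run it from the project root, that can be as simple as:

rm -rf .booster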

- +if you're not planning to keep using it.

Creating an Azure account

As mentioned, you need to have an Azure account. If you don't have one, you can create one from the Microsoft SignUp page. You can also use your existing Microsoft account to create an Azure account.

Installing the Azure CLI

Once you have created the Azure account, you need to install the Azure CLI on your computer. You can do it by following the instructions on the official documentation. You may also need to install jq on your system.

Azure Provider configuration

To configure Azure as a provider you need to meet certain prerequisites:

  • Install jq in your system.
  • Install the terraform CLI.
  • Check @boostercloud/framework-provider-azure is listed in your app package.json dependencies.
  • Check @boostercloud/framework-provider-azure-infrastructure is listed in your app package.json devDependencies.
  • Check both dependencies are installed, otherwise use npm install in the root of your project.

At this moment you have to log in you Azure account using the Azure CLI with the following command.

az login

You will get something like this:

[
{
"cloudName": "AzureCloud",
"homeTenantId": "00000000-0000-0000-0000-000000000000",
"id": "00000000-0000-0000-0000-000000000000",
"isDefault": true,
"managedByTenants": [],
"name": "Azure subscription name",
"state": "Enabled",
"tenantId": "00000000-0000-0000-0000-000000000000",
"user": {
"name": "boosteduser@boosteddomain.com",
"type": "user"
}
}
]

Keep the id from the login output around.

Then create a service pricipal that is a contributor to a chosen subscription running the following command.

az ad sp create-for-rbac --name <service-principal-name> --role="Contributor" --scopes="/subscriptions/<the-id-from-the-login-output>"
note

Remember to change <service-principal-name> for a custom one.

After the service principal is created, create a bash script with the following content. It will set up the necessary environment variables required by the provider in order to work:

#!/usr/bin/env bash

SP_DISPLAY_NAME="<service-principal-name>" #replace <service-principal-name> with the name of your own SP
REGION="East US" #replace with a region of your choice, see full list here: https://azure.microsoft.com/en-us/global-infrastructure/locations/

export AZURE_APP_ID=$(az ad sp list --display-name ${SP_DISPLAY_NAME} | jq -r '.[].appId')
export AZURE_TENANT_ID=$(az ad sp list --display-name ${SP_DISPLAY_NAME} | jq -r '.[].appOwnerOrganizationId')
export AZURE_SECRET=$(az ad sp credential reset --id ${AZURE_APP_ID} | jq -r '.password')
export AZURE_SUBSCRIPTION_ID=$(az account show | jq -r '.id')
export REGION=$REGION
note

Remember to have jq installed in your system.

Now go to your config.ts file, import the aws provider library and set up your app environment.

import { Booster } from '@boostercloud/framework-core'
import { BoosterConfig } from '@boostercloud/framework-types'

Booster.configure('production', (config: BoosterConfig): void => {
config.appName = 'my-app-name'
config.providerPackage = '@boostercloud/framework-provider-azure'
})

Open your terminal and run the bash file to export you env variables and the deploy command

source <path-to-your-bash-file> && boost deploy -e production
note

Remember to have the terraform CLI installed in your system.

Now just let the magic happen, Booster will create everything for you and give you back your app ready to use URL. 🚀

Azure synth command

Azure provider implements the experimental Booster synth command. This command will generate Terraform templates from your code. It will also generate needed files to deploy your Booster application using cdktf.

Running synth command, for example boost synth -e production will generate following files:

  • A file cdktf.json: A basic json file to deploy your application using cdktf
  • A folder cdktf.out: with the Terraform templates.

Booster deploy command for Azure will deploy your application using the generated templates. You don't need to run the synth command for deploy your application, the deploy command will generate the templates before deploy for you.

Once you have the new files and folders generates you could use cdktf to deploy your application if you want to.

Azure and CI/CD environments

It is possible to deploy your Azure infrastructure using the Terraform templates generated by the synth command using Terraform executable.

To deploy a Booster application using the Terraform templates generated by the Booster synth command:

  1. Navigate to
> cd cdktf.out/stacks/<application name><environment name>
  1. Run (only the first time)
> terraform init
  1. Run
> terraform plan --out plan
  1. Run
> terraform apply "plan"

You could follow similar steps to integrate the Azure Booster deploys in your CI/CD environment.

  1. Navigate to
> cd cdktf.out/stacks/<application name><environment name>
  1. Copy functionApp.zip to the destination folder
> cp functionApp.zip <destination>

After copying the files you should have the following structure:

<application>
├── cdktf.out
│ └── stacks
│ └── <application name><environment name>
│ └── cdk.tf.json

Now deploy the template:

  1. Run (only the first time)
> terraform init
  1. Run
> terraform plan --out plan
  1. Run
> terraform apply "plan"

Finally, you need to upload the source code. The main options are (more info):

  1. Using az-cli. Run
> az functionapp deployment source config-zip -g <resource_group> -n \
<app_name> --src ./functionApp.json
  1. Using REST APIs. Send a POST request to https://<app_name>.scm.azurewebsites.net/api/zipdeploy. Example:
>  curl -X POST -u <deployment_user> --data-binary @"<zip_file_path>" https://<app_name>.scm.azurewebsites.net/api/zipdeploy
note

Remember to follow the Azure Provider steps on this page to set up your credentials correctly.

Azure host.json file

The Azure Provider will generate a default host.json file if there is no host.json entry in the config.assets array.

If you want to use your own host.json file, just add it to the config.assets array and Booster will use yours.
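
For example, assuming you keep a custom host.json file at the root of your project (the path is illustrative), a minimal sketch of that configuration could look like this:

import { Booster } from '@boostercloud/framework-core'
import { BoosterConfig } from '@boostercloud/framework-types'

Booster.configure('production', (config: BoosterConfig): void => {
  config.appName = 'my-app-name'
  config.providerPackage = '@boostercloud/framework-provider-azure'
  // Booster will package and use this file instead of generating a default one
  config.assets = ['host.json']
})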

Local Provider

All Booster projects come with a local development environment configured by default, so you can test your app before deploying it to the cloud.

You can see the configured local environment in your src/config/config.ts file:

Booster.configure('local', (config: BoosterConfig): void => {
config.appName = 'my-store'
config.providerPackage = '@boostercloud/framework-provider-local'
})

In order to start your application using the local provider, use the following command:

boost start -e local

Where local is one of the environments you have defined with the Booster.configure call.

Cleaning Local Data for New Changes

When making changes to your entities and events, you might need to reset the local environment to accommodate these changes. The application creates a .booster folder to store relevant information. To clean the local data and reset the environment:

  1. Locate the .booster folder in your project directory.
  2. Delete the contents of the .booster folder or the folder itself.

This action will clear the local data and allow you to proceed with your new changes effectively.

+ \ No newline at end of file diff --git a/going-deeper/instrumentation/index.html b/going-deeper/instrumentation/index.html index 88f11937c..7b1311b5b 100644 --- a/going-deeper/instrumentation/index.html +++ b/going-deeper/instrumentation/index.html @@ -6,13 +6,13 @@ Booster instrumentation | Booster Framework - +
-

Booster instrumentation

Trace Decorator

The Trace Decorator is a Booster functionality that facilitates the reception of notifications whenever significant events occur in Booster's core, such as event dispatching or migration execution.

Usage

To configure a custom tracer, you need to define an object with two methods: onStart and onEnd. The onStart method is called before the traced method is invoked, and the onEnd method is called after the method completes. Both methods receive a TraceInfo object, which contains information about the traced method and its arguments.

Here's an example of a custom tracer that logs trace events to the console:

import {
TraceParameters,
BoosterConfig,
TraceActionTypes,
} from '@boostercloud/framework-types'

class MyTracer {
static async onStart(config: BoosterConfig, actionType: string, traceParameters: TraceParameters): Promise<void> {
console.log(`Start ${actionType}: ${traceParameters.className}.${traceParameters.methodName}`)
}

static async onEnd(config: BoosterConfig, actionType: string, traceParameters: TraceParameters): Promise<void> {
console.log(`End ${actionType}: ${traceParameters.className}.${traceParameters.methodName}`)
}
}

You can then configure the tracer in your Booster application's configuration:

import { BoosterConfig } from '@boostercloud/framework-types'
import { MyTracer } from './my-tracer'

const config: BoosterConfig = {
// ...other configuration options...
trace: {
enableTraceNotification: true,
onStart: MyTracer.onStart,
onEnd: MyTracer.onEnd,
}
}

In the configuration above, we've enabled trace notifications and specified our onStart and onEnd methods as the handlers to use. Setting includeInternal to false will reduce the amount of information generated by excluding the internal property from the trace parameters.

Setting enableTraceNotification: true would enable the trace for all actions. You can either disable them by setting it to false or selectively enable only specific actions using an array of TraceActionTypes.

import { BoosterConfig, TraceActionTypes } from '@boostercloud/framework-types'
import { MyTracer } from './my-tracer'

const config: BoosterConfig = {
// ...other configuration options...
trace: {
enableTraceNotification: [TraceActionTypes.DISPATCH_EVENTS, TraceActionTypes.MIGRATION_RUN, 'OTHER'],
includeInternal: false,
onStart: MyTracer.onStart,
onEnd: MyTracer.onEnd,
}
}

In this example, only DISPATCH_EVENTS, MIGRATION_RUN and 'OTHER' actions will trigger trace notifications.

TraceActionTypes

The TraceActionTypes enum defines all the traceable actions in Booster's core:

export enum TraceActionTypes {
CUSTOM,
EVENT_HANDLERS_PROCESS,
HANDLE_EVENT,
DISPATCH_ENTITY_TO_EVENT_HANDLERS,
DISPATCH_EVENTS,
FETCH_ENTITY_SNAPSHOT,
STORE_SNAPSHOT,
LOAD_LATEST_SNAPSHOT,
LOAD_EVENT_STREAM_SINCE,
ENTITY_REDUCER,
READ_MODEL_FIND_BY_ID,
GRAPHQL_READ_MODEL_SEARCH,
READ_MODEL_SEARCH,
COMMAND_HANDLER,
MIGRATION_RUN,
GRAPHQL_DISPATCH,
GRAPHQL_RUN_OPERATION,
SCHEDULED_COMMAND_HANDLER,
DISPATCH_SUBSCRIBER_NOTIFIER,
READ_MODEL_SCHEMA_MIGRATOR_RUN,
SCHEMA_MIGRATOR_MIGRATE,
}

TraceInfo

The TraceInfo interface defines the data that is passed to the tracer's onStart and onEnd methods:

export interface TraceInfo {
className: string
methodName: string
args: Array<unknown>
traceId: UUID
elapsedInvocationMillis?: number
internal: {
target: unknown
descriptor: PropertyDescriptor
}
description?: string
}

className and methodName identify the function that is being traced.
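
As an illustration, an onEnd handler could use these fields to report how long each traced call took. This is only a sketch, assuming elapsedInvocationMillis is populated by the framework once the traced call completes:

import { BoosterConfig, TraceParameters } from '@boostercloud/framework-types'

export class TimingTracer {
  static async onStart(config: BoosterConfig, actionType: string, traceParameters: TraceParameters): Promise<void> {
    // Nothing to do on start; we only report on completion
  }

  static async onEnd(config: BoosterConfig, actionType: string, traceParameters: TraceParameters): Promise<void> {
    const { className, methodName, elapsedInvocationMillis } = traceParameters
    console.log(`End ${actionType}: ${className}.${methodName} took ${elapsedInvocationMillis ?? '?'} ms`)
  }
}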

Adding the Trace Decorator to Your own async methods

In addition to using the Trace Decorator to receive notifications when events occur in Booster's core, you can also use it to trace your own methods. To add the Trace Decorator to your own methods, simply add @Trace() before your method declaration.

Here's an example of how to use the Trace Decorator on a custom method:

import { Trace } from '@boostercloud/framework-core'
import { BoosterConfig, Logger } from '@boostercloud/framework-types'

export class MyCustomClass {
@Trace('OTHER')
public async myCustomMethod(config: BoosterConfig, logger: Logger): Promise<void> {
logger.debug('This is my custom method')
// Do some custom logic here...
}
}

In the example above, we added the @Trace('OTHER') decorator to the myCustomMethod method. This will cause the method to emit trace events when it's invoked, allowing you to trace the flow of your application and detect performance bottlenecks or errors.

Note that when you add the Trace Decorator to your own methods, you'll need to configure your Booster instance to use a tracer that implements the necessary methods to handle these events.

+ \ No newline at end of file diff --git a/going-deeper/register/index.html b/going-deeper/register/index.html index a3a64feef..09e25ee92 100644 --- a/going-deeper/register/index.html +++ b/going-deeper/register/index.html @@ -6,13 +6,13 @@ Advanced uses of the Register object | Booster Framework - +
-

Advanced uses of the Register object

The Register object is a built-in object that is automatically injected by the framework into all command or event handlers to let users interact with the execution context. It can be used for a variety of purposes, including:

  • Registering events to be emitted at the end of the command or event handler
  • Manually flushing the registered events so they are persisted synchronously to the event store
  • Accessing the current signed-in user, their roles, and other claims included in their JWT token
  • In a command: accessing the request context or altering the HTTP response headers

Registering events

When handling a command or event, you can use the Register object to register one or more events that will be emitted when the command or event handler is completed. Events are registered using the register.events() method, which takes one or more events as arguments. For example:

public async handle(register: Register): Promise<void> {
// Do some work...
register.events(new OrderConfirmed(this.orderID))
// Do more work...
}

In this example, we're registering an OrderConfirmed event to be persisted to the event store when the handler finishes. You can also register multiple events by passing them as separate arguments to the register.events() method:

public async handle(register: Register): Promise<void> {
// Do some work...
register.events(
new OrderConfirmed(this.orderID),
new OrderShipped(this.orderID)
)
// Do more work...
}

It's worth noting that events registered with register.events() aren't immediately persisted to the event store. Instead, they're stored in memory until the command or event handler finishes executing. To force the events to be persisted immediately, you can call the register.flush() method that is described in the next section.

Manually flush the events

As mentioned in the previous section, events registered with register.events() aren't immediately persisted to the event store. Instead, they're stored in memory until the command or event handler finishes its execution. This doesn't work in all situations: sometimes it's useful to store partial updates of a longer process, and some scenarios can accept partial successes. To force the events to be persisted and wait for the database to confirm the write, you can use the register.flush() method.

The register.flush() method takes no arguments and returns a promise that resolves when the events have been successfully persisted to the event store. For example:

public async handle(register: Register): Promise<void> {
// Do some work...
register.events(new OrderConfirmed(this.orderID))
await register.flush()
const mailID = await sendConfirmationEmail(this.orderID)
register.events(new MailSent(this.orderID, mailID))
// Do more work...
}

In this example, we're calling register.flush() after registering an OrderConfirmed event to ensure that it's persisted to the event store before continuing with the rest of the handler logic. In this way, even if an error happens while sending the confirmation email, the order will be persisted.

Access the current signed in user

When handling a command or event, you can use the injected Register object to access the currently signed-in user as well as any metadata included in their JWT token, like their roles or other claims (the specific claims will depend on the auth provider used). To do this, you can use the currentUser property. This property is an instance of the UserEnvelope interface, which has the following properties:

export interface UserEnvelope {
id?: string // An optional identifier of the user
username: string // The unique username of the current user
roles: Array<string> // The list of role names assigned to this user
claims: Record<string, unknown> // An object containing the claims included in the body of the JWT token
header?: Record<string, unknown> // An object containing the headers of the JWT token for further verification
}

For example, to access the username of the currently signed-in user, you can use the currentUser.username property:

public async handle(register: Register): Promise<void> {
console.log(`The currently signed-in user is ${register.currentUser?.username}`)
}

// Output: The currently signed-in user is john.doe
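
Beyond logging, the same property can back simple permission checks inside a handler. The following is only a sketch; the 'Admin' role name is an assumption and would need to exist in your application:

public async handle(register: Register): Promise<void> {
  const roles = register.currentUser?.roles ?? []
  if (!roles.includes('Admin')) {
    throw new Error('Only administrators can perform this action')
  }
  // Continue with the domain logic and register events...
}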

Command-specific features

Command handlers are executed as part of a GraphQL mutation request, so they have access to a few command-specific features that can be used to access the request context or alter the HTTP response headers.

Access the request context

The request context is injected in the command handler as part of the register object, and you can access it using the context property. This property is an instance of the ContextEnvelope interface, which has the following properties:

export interface ContextEnvelope {
/** Decoded request header and body */
request: {
headers: unknown
body: unknown
}
/** Provider-dependent raw request context object */
rawContext: unknown
}

The request property exposes a normalized version of the request headers and body that can be used regardless of the provider. We recommend using this property instead of the rawContext property, as it will be more portable across providers.

The rawContext property exposes the full raw request context as it comes in the original request, so it will depend on the underlying provider used. For instance, in AWS, it will be a lambda context object, while in Azure it will be an Azure Functions context object.
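
For instance, a handler could read a specific header from the normalized request. This is a sketch: the 'x-correlation-id' header is just an example, and the cast is needed because headers is typed as unknown:

public async handle(register: Register): Promise<void> {
  const headers = (register.context?.request.headers ?? {}) as Record<string, string>
  const correlationId = headers['x-correlation-id']
  console.log(`Handling request with correlation id: ${correlationId}`)
}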

Alter the HTTP response headers

Finally, you can use the responseHeaders property to alter the HTTP response headers that will be sent back to the client. This property is a plain TypeScript object which is initialized with the default headers. You can add, remove or modify any of the headers using standard object operations:

public async handle(register: Register): Promise<void> {
register.responseHeaders['X-My-Header'] = 'My custom header'
register.responseHeaders['X-My-Other-Header'] = 'My other custom header'
delete register.responseHeaders['X-My-Other-Header']
}
+ \ No newline at end of file diff --git a/going-deeper/rockets/index.html b/going-deeper/rockets/index.html index 710a99f2a..d46da75d3 100644 --- a/going-deeper/rockets/index.html +++ b/going-deeper/rockets/index.html @@ -6,13 +6,13 @@ Extending Booster with Rockets! | Booster Framework - +
-

Extending Booster with Rockets!

You can extend Booster by creating rockets (Booster Framework extensions). A rocket is just a node package that implements the public Booster rocket interfaces. You can use them for:

  1. Extend your infrastructure: You can write a rocket that adds provider resources to your application stack.
  2. Runtime extensions: Add new annotations and interfaces, which combined with infrastructure extensions, could implement new abstractions on top of highly requested use cases.

If you want to create a rocket that supports several cloud providers or want to provide extra decorators and functionality on top of the infrastructure extensions, you'll probably need to distribute it as a set of separate packages. In this scenario we recommend using a monorepo management tool like Microsoft Rush to maintain them all together in a single repository, but this is not a requirement. Your packages will work perfectly fine if you maintain them in separate repositories.

Create an Infrastructure Rocket package to extend the default Booster-provided infrastructure

A rocket is an npm package that extends your current Booster architecture. The structure is simple, and it mainly has 2 methods: mountStack and unmountStack. We'll explain what they are shortly.

Infrastructure Rocket interfaces are provider-dependent because each provider defines its own way to manage context, so Infrastructure Rockets must import the corresponding Booster infrastructure package for their chosen provider:

  • For AWS: @boostercloud/framework-provider-aws-infrastructure
  • For Azure: @boostercloud/framework-provider-azure-infrastructure
  • For Local (dev environment): @boostercloud/framework-provider-local-infrastructure

Notice that, as the only thing you'll need from that package is the InfrastructureRocket interface, it is preferable to import it as a dev dependency to avoid including such a big package in your deployed lambdas.

So let's start by creating a new package and adding the provider-dependent dependency as well as the typescript and @boostercloud/framework-types packages:

mkdir rocket-your-rocket-name-aws-infrastructure
cd rocket-your-rocket-name-aws-infrastructure
npm init
...
npm install --save-dev @boostercloud/framework-provider-aws-infrastructure @boostercloud/framework-types typescript

In the case of AWS we use the AWS CDK for TypeScript, so you'll also need to install the AWS CDK package:

npm install --save-dev @aws-cdk/core

The basic structure of an Infrastructure Rocket project is quite simple as you can see here:

rocket-your-rocket-name-aws-infrastructure
├── package.json
└── src
    ├── index.ts
    └── your-main-class.ts

<your-main-class>.ts can be named as you want, and this is where we define the mountStack and unmountStack methods.

import { RocketUtils } from '@boostercloud/framework-provider-aws-infrastructure'
import { BoosterConfig } from '@boostercloud/framework-types'
import { Stack } from '@aws-cdk/core'
import { YourRocketParams } from '.'

export class YourMainClass {
public static mountStack(params: YourRocketParams, stack: Stack, config: BoosterConfig): void {
/* CDK code to expand your Booster infrastructure */
}
public static unmountStack(params: YourRocketParams, utils: RocketUtils): void {
/* Optional code that runs before removing the stack */
}
}

Let's look at these two special functions in more detail:

  • mountStack: Whenever we are deploying our Booster application (boost deploy) this method will also be run. It receives two params:

    • stack: An initialized AWS CDK stack that you can use to add new resources. Check out the Stack API in the official CDK documentation. This is the same stack instance that Booster uses to deploy its resources, so your resources will automatically be deployed along with the Booster's ones on the same stack.
    • config: It includes properties of the Booster project (e.g. project name) that come in handy for your rocket.
  • unmountStack: This function executes when you run the boost nuke command, just before starting the deletion of the cloud resources. When you nuke your Booster application, resources added by your rocket are automatically destroyed with the rest of the application stack. However, in certain cases, you may need extra steps during the deletion process. The unmountStack function serves this purpose. For example, in AWS, you must first empty any S3 buckets before deleting your stack. You can achieve this within the unmountStack method.

In addition to your main rocket class, you'll need an index.ts file that default exports an object that conforms to the InfrastructureRocket interface:

export interface InfrastructureRocket {
mountStack: (stack: Stack, config: BoosterConfig) => void
unmountStack?: (utils: RocketUtils) => void
}

You'll have to implement a default exported function that accepts a parameters object and returns an initialized InfrastructureRocket object:

import { InfrastructureRocket } from '@boostercloud/framework-provider-aws-infrastructure'
import { YourMainClass } from './your-main-class';

export interface YourRocketParams {
param1: string
}

const YourRocketInitializator = (params: YourRocketParams): InfrastructureRocket => ({
mountStack: YourMainClass.mountStack.bind(null, params),
unmountStack: YourMainClass.unmountStack.bind(null, params),
})

export default YourRocketInitializator

Note that Infrastructure Rockets must not be part of the Booster application code to prevent including the CDK and other unnecessary dependencies in the deployed lambdas. This is due to strict code size restrictions on most platforms. To address this, Infrastructure Rockets are dynamically loaded by Booster, using package names as strings in the application config file:

src/config/production.ts:

Booster.configure('development', (config: BoosterConfig): void => {
config.appName = 'my-store'
config.providerPackage = '@boostercloud/framework-provider-aws'
config.rockets = [
{
packageName: 'rocket-your-rocket-name-aws-infrastructure', // Your infrastructure rocket package name
parameters: {
// A custom object with the parameters needed by your infrastructure rocket initializer
hello: 'world',
},
},
]
})

Your rocket implementation will have access to the stack (CDK in AWS or Terraform in Azure) just after Booster has finished adding all its default resources, so while the most common scenario for a rocket is to create additional resources, it's also possible to inspect or alter the Booster stack. If you're considering creating and maintaining your own fork of one of the default provider runtime implementations, it could be easier to create a rocket instead.
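
As a sketch of that common case, a mountStack implementation for AWS could add one extra resource to the Booster-managed stack using the CDK. The S3 bucket and its name below are purely illustrative, and @aws-cdk/aws-s3 would need to be added as a dev dependency:

import { BoosterConfig } from '@boostercloud/framework-types'
import { Stack } from '@aws-cdk/core'
import { Bucket } from '@aws-cdk/aws-s3'
import { YourRocketParams } from '.'

export class YourMainClass {
  public static mountStack(params: YourRocketParams, stack: Stack, config: BoosterConfig): void {
    // Deployed along with the rest of the Booster resources on the same stack
    new Bucket(stack, 'YourRocketUploadsBucket', {
      bucketName: `${config.appName}-your-rocket-uploads`,
    })
  }
}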

Provide new abstractions with custom decorators

Rockets can be utilized to extend the Booster framework by providing additional decorators that offer new abstractions. When creating a decorator as part of your rocket, you should deliver it as a package that, once compiled, does not have any infrastructure dependencies. So if your rocket provides both infrastructure and runtime extensions, it's advisable to deliver them as two or more packages.

A common pattern when creating decorators for Booster is to use a singleton object to store metadata about the decorated structures. This singleton object stores data generated during the decorator's execution, which can then be accessed from other parts of the user's project, the rocket's infrastructure package or even other rockets. This data can be used during deployment to generate extra tables, endpoints, or other resources.

To create a new custom decorator for the Booster framework with singleton storage, follow these steps:

  1. Create a new npm package for your rocket. This package should not have any infrastructure dependencies once compiled.
$ mkdir my-booster-rocket
$ cd my-booster-rocket
$ npm init
  2. Add typescript as a dev dependency
$ npm install typescript --save-dev
  3. Create a src directory to hold your decorator code:
$ mkdir src
  4. Inside the src directory, create a new TypeScript file for your singleton object, e.g., RocketSingleton.ts:
$ touch src/RocketSingleton.ts
  5. Implement your singleton object to store your metadata, for instance, a list of special classes that we will "mark" for later:
// src/RocketSingleton.ts
export class RocketSingleton {
public static specialClasses: Function[] = [];

private constructor() {}

public static addSpecialClass(target: Function): void {
RocketSingleton.specialClasses.push(target)
}
}
  6. Create a new TypeScript file for your custom decorator, e.g., MyCustomDecorator.ts:
$ touch src/MyCustomDecorator.ts
  7. Implement your custom decorator using the singleton object:
// src/MyCustomDecorator.ts
import { RocketSingleton } from "./RocketSingleton"

export function MyCustomDecorator(): (target: Function) => void {
return (target: Function) => {
// Implement your decorator logic here.
console.log(`MyCustomDecorator applied on ${target.name}`)
RocketSingleton.addSpecialClass(target)
}
}
  8. Export your decorator from the package's entry point, e.g., index.ts:
// src/index.ts
export * from './MyCustomDecorator';
export * from './RocketSingleton';

Now you have a custom decorator that can be used within the Booster framework. Users can install your rocket package and use the decorator in their Booster applications:

$ npm install my-booster-rocket
// src/MySpecialClass.ts
import { MyCustomDecorator, RocketSingleton } from 'my-booster-rocket';

@MyCustomDecorator()
class MySpecialClass {
// Application logic here
}

console.log(RocketSingleton.specialClasses) // [ [Function: MySpecialClass] ]

This example demonstrates how to create a custom decorator with a singleton object for storing data and package it as a rocket for use with the Booster framework. Following this pattern will allow you to extend Booster with new abstractions and provide additional functionality for users. The singleton object can be used to store and retrieve data across different parts of the user's project, enabling features such as generating extra tables or endpoints during deployment. This approach ensures a consistent and flexible way to extend the Booster framework while maintaining ease of use for developers.
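
To give an idea of how that metadata could be consumed, a companion infrastructure rocket might iterate over the registered classes while mounting the stack and create one resource per marked class. This is only a sketch, assuming the application code (and therefore the decorators) has already been loaded by the time mountStack runs; the CloudFormation output is just a stand-in for a real table or endpoint:

import { BoosterConfig } from '@boostercloud/framework-types'
import { Stack, CfnOutput } from '@aws-cdk/core'
import { RocketSingleton } from 'my-booster-rocket'

export class MyBoosterRocketInfrastructure {
  public static mountStack(params: unknown, stack: Stack, config: BoosterConfig): void {
    // One placeholder resource per class marked with @MyCustomDecorator
    RocketSingleton.specialClasses.forEach((specialClass) => {
      new CfnOutput(stack, `SpecialClass${specialClass.name}`, { value: specialClass.name })
    })
  }
}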

Naming recommendations

There are no restrictions on how you name your rocket packages, but we propose the following naming convention to make it easier to find your extensions in the vast npm registry and to find related packages (code and infrastructure extensions cannot be distributed in the same package).

  • rocket-{rocket-name}-{provider}: A rocket that adds runtime functionality or init scripts. This code will be deployed along with your application code to the lambdas.
  • rocket-{rocket-name}-{provider}-infrastructure: A rocket that provides infrastructure extensions or implements deploy hooks. This code will only be used on developer's or CI/CD systems machines and won't be deployed to lambda with the rest of the application code.

Notice that some functionalities, for instance an S3 uploader, might require both runtime and infrastructure extensions. In these cases, the convention is to use the same rocket-name for both packages and add the -infrastructure suffix to the infrastructure rocket. It's recommended, but not required, to manage these dependent packages in a monorepo and ensure that the versions match on each release.

If you want to support the same functionality in several providers, it could be handy to also have a package named rocket-{rocket-name}-core where you can put cross-provider code that you can use from all the provider-specific implementations. For instance, a file uploader rocket that supports both AWS and Azure could have a structure like this:

  • rocket-file-uploader-core: Defines abstract decorators and interfaces to handle uploaded files.
  • rocket-file-uploader-aws: Implements the API calls to S3 to get the uploaded files.
  • rocket-file-uploader-aws-infrastructure: Adds a dedicated S3 bucket.
  • rocket-file-uploader-azure: Implements the API calls to Azure Storage to get the uploaded files.
  • rocket-file-uploader-azure-infrastructure: Configures file storage.

Booster Rockets list

Here you can check out the official Booster Rockets developed at this time:

+ \ No newline at end of file diff --git a/going-deeper/rockets/rocket-backup-booster/index.html b/going-deeper/rockets/rocket-backup-booster/index.html index a96da7f14..811194596 100644 --- a/going-deeper/rockets/rocket-backup-booster/index.html +++ b/going-deeper/rockets/rocket-backup-booster/index.html @@ -6,13 +6,13 @@ Backup Booster Rocket | Booster Framework - +
-

Backup Booster Rocket

This rocket adds backup capabilities to your Booster DynamoDB tables using the point-in-time recovery feature or the on-demand one.

After enabling this rocket, all DynamoDB tables generated by Booster Framework (read-models, event-stores, subscription store...) will be automatically backed up.

info

GitHub Repo

Disclaimer: As of the date this rocket was developed, and in the latest CDK version (1.85), the export to S3 feature is not available. That means that if you want to take advantage of this feature, you'll have to do it by using the AWS console or AWS CLI, until we can update this rocket with that feature.

Usage

Install this package as a dev dependency in your Booster project:

npm install --save-dev @boostercloud/rocket-backup-aws-infrastructure

In your Booster config file, pass a RocketDescriptor array to the AWS Provider initializer to configure the backup rocket:

src/config/config.ts
import { Booster } from '@boostercloud/framework-core'
import { BoosterConfig } from '@boostercloud/framework-types'
import * as AWS from '@boostercloud/framework-provider-aws'

Booster.configure('development', (config: BoosterConfig): void => {
config.appName = 'my-store'
config.provider = AWS.Provider([{
packageName: '@boostercloud/rocket-backup-aws-infrastructure',
parameters: {
backupType: 'ON_DEMAND', // or 'POINT_IN_TIME'
// onDemandBackupRules is optional and uses cron notation. Cron params are all optional too.
onDemandBackupRules: {
minute: '30',
hour: '3',
day: '15',
month: '5',
weekDay: '4', // Weekday is also supported, but can't be set along with 'day' parameter
year: '2077',
}
}
}])
})
+ \ No newline at end of file diff --git a/going-deeper/rockets/rocket-file-uploads/index.html b/going-deeper/rockets/rocket-file-uploads/index.html index 429745a4f..7cbd1ee34 100644 --- a/going-deeper/rockets/rocket-file-uploads/index.html +++ b/going-deeper/rockets/rocket-file-uploads/index.html @@ -6,14 +6,14 @@ File Uploads Rocket | Booster Framework - +

File Uploads Rocket

This package is a configurable rocket to add a storage API to your Booster applications.

Supported Providers

  • Azure Provider
  • AWS Provider
  • Local Provider

Overview

This rocket provides some methods to access files stored in your cloud provider:

  • presignedPut: Returns a presigned put url and the necessary form params. With this url files can be uploaded directly to your provider.
  • presignedGet: Returns a presigned get url to download a file. With this url files can be downloaded directly from your provider.
  • list: Returns a list of files stored in the provider.
  • deleteFile: Removes a file from a directory (only supported in AWS at the moment).

These methods may be used from a Command in your project secured via JWT Token. This rocket also provides a Booster Event each time a file is uploaded.

Usage

Install needed dependency packages:

npm install --save @boostercloud/rocket-file-uploads-core @boostercloud/rocket-file-uploads-types
npm install --save @boostercloud/rocket-file-uploads-azure

Also, you will need a devDependency in your project:

npm install --save-dev @boostercloud/rocket-file-uploads-azure-infrastructure

In your Booster config file, configure your BoosterRocketFiles:

src/config/config.ts
import { Booster } from '@boostercloud/framework-core'
import { BoosterConfig } from '@boostercloud/framework-types'
import { BoosterRocketFiles } from '@boostercloud/rocket-file-uploads-core'
import { RocketFilesUserConfiguration } from '@boostercloud/rocket-file-uploads-types'

const rocketFilesConfigurationDefault: RocketFilesUserConfiguration = {
storageName: 'STORAGE_NAME',
containerName: 'CONTAINER_NAME',
directories: ['DIRECTORY_1', 'DIRECTORY_2'],
}

const rocketFilesConfigurationCms: RocketFilesUserConfiguration = {
storageName: 'cmsst',
containerName: 'rocketfiles',
directories: ['cms1', 'cms2'],
}

Booster.configure('production', (config: BoosterConfig): void => {
config.appName = 'TEST_APP_NAME'
config.providerPackage = '@boostercloud/framework-provider-azure'
config.rockets = [
new BoosterRocketFiles(config, [rocketFilesConfigurationDefault, rocketFilesConfigurationCms]).rocketForAzure(),
]
})

info

Available parameters are:

  • storageName: Name of the storage repository.
  • containerName: Directories container.
  • directories: A list of folders where the files will be stored.

The structure created will be:

├── storageName
│   ├── containerName
│   │   ├── directory

NOTE: Azure Provider will use storageName as the Storage Account Name.

Rocket Methods Usage

Presigned Put

Create a command in your application and call the presignedPut method on the FileHandler class with the directory and filename you want to upload on the storage.

The storageName parameter is optional. It will use the first storage if undefined.

src/commands/file-upload-put.ts
import { Booster, Command } from '@boostercloud/framework-core'
import { Register } from '@boostercloud/framework-types'
import { FileHandler } from '@boostercloud/rocket-file-uploads-core'

@Command({
authorize: 'all',
})
export class FileUploadPut {
public constructor(readonly directory: string, readonly fileName: string, readonly storageName?: string) {}

public static async handle(command: FileUploadPut, register: Register): Promise<string> {
const boosterConfig = Booster.config
const fileHandler = new FileHandler(boosterConfig, command.storageName)
return await fileHandler.presignedPut(command.directory, command.fileName)
}
}

GraphQL Mutation:

mutation {
FileUploadPut(input: {
storageName: "clientst",
directory: "client1",
fileName: "myfile.txt"
}
)
}

Azure Response:

{
"data": {
"FileUploadPut": "https://clientst.blob.core.windows.net/rocketfiles/client1/myfile.txt?<SAS>"
}
}
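
Once the client has that URL, it can upload the file directly to the storage. The snippet below is a hedged sketch assuming an Azure Blob Storage SAS URL, which typically requires the x-ms-blob-type header on a PUT request:

const presignedUrl = 'https://clientst.blob.core.windows.net/rocketfiles/client1/myfile.txt?<SAS>'

await fetch(presignedUrl, {
  method: 'PUT',
  headers: { 'x-ms-blob-type': 'BlockBlob', 'Content-Type': 'text/plain' },
  body: 'file contents',
})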
Presigned Get

Create a command in your application and call the presignedGet method on the FileHandler class with the directory and filename you want to get on the storage.

The storageName parameter is optional. It will use the first storage if undefined.

src/commands/file-upload-get.ts
import { Booster, Command } from '@boostercloud/framework-core'
import { Register } from '@boostercloud/framework-types'
import { FileHandler } from '@boostercloud/rocket-file-uploads-core'

@Command({
authorize: 'all',
})
export class FileUploadGet {
public constructor(readonly directory: string, readonly fileName: string, readonly storageName?: string) {}

public static async handle(command: FileUploadGet, register: Register): Promise<string> {
const boosterConfig = Booster.config
const fileHandler = new FileHandler(boosterConfig, command.storageName)
return await fileHandler.presignedGet(command.directory, command.fileName)
}
}

GraphQL Mutation:

mutation {
FileUploadGet(input: {
storageName: "clientst",
directory: "client1",
fileName: "myfile.txt"
}
)
}

Azure Response:

{
"data": {
"FileUploadGet": "https://clientst.blob.core.windows.net/rocketfiles/folder01%2Fmyfile.txt?<SAS>"
}
}
List

Create a command in your application and call the list method on the FileHandler class with the directory whose contents you want to list; it returns the formatted results.

The storageName parameter is optional. It will use the first storage if undefined.

src/commands/file-upload-list.ts
import { Booster, Command } from '@boostercloud/framework-core'
import { Register } from '@boostercloud/framework-types'
import { FileHandler } from '@boostercloud/rocket-file-uploads-core'
import { ListItem } from '@boostercloud/rocket-file-uploads-types'

@Command({
authorize: 'all',
})
export class FileUploadList {
public constructor(readonly directory: string, readonly storageName?: string) {}

public static async handle(command: FileUploadList, register: Register): Promise<Array<ListItem>> {
const boosterConfig = Booster.config
const fileHandler = new FileHandler(boosterConfig, command.storageName)
return await fileHandler.list(command.directory)
}
}

GraphQL Mutation:

mutation {
FileUploadList(input: {
storageName: "clientst",
directory: "client1"
}
)
}

Response:

{
"data": {
"FileUploadList": [
{
"name": "client1/myfile.txt",
"properties": {
"createdOn": "2022-10-26T05:40:47.000Z",
"lastModified": "2022-10-26T05:40:47.000Z",
"contentLength": 6,
"contentType": "text/plain"
}
}
]
}
}
Delete File
Currently, the option to delete a file is only available on AWS. If this is a feature you were looking for, please let us know on Discord. Alternatively, you can implement this feature and submit a pull request on GitHub for this Rocket!

Azure Roles

info

Starting at version 0.31.0, this Rocket uses Managed Identities instead of Connection Strings. Please check that you have the required permissions to assign roles: https://learn.microsoft.com/en-us/azure/role-based-access-control/role-assignments-portal-managed-identity#prerequisites

To upload files to Azure you need the Storage Blob Data Contributor role. It can be assigned to a user through the portal or with the following scripts:

First, check if you have the correct permissions:

ACCOUNT_NAME="<STORAGE ACCOUNT NAME>"
CONTAINER_NAME="<CONTAINER NAME>"

# use this to test if you have the correct permissions
az storage blob exists --account-name $ACCOUNT_NAME `
--container-name $CONTAINER_NAME `
--name blob1.txt --auth-mode login

If you don't have it, then run this script as admin:

ACCOUNT_NAME="<STORAGE ACCOUNT NAME>"
CONTAINER_NAME="<CONTAINER NAME>"

OBJECT_ID=$(az ad user list --query "[?mailNickname=='<YOUR MAIL NICK NAME>'].objectId" -o tsv)
STORAGE_ID=$(az storage account show -n $ACCOUNT_NAME --query id -o tsv)

az role assignment create \
--role "Storage Blob Data Contributor" \
--assignee $OBJECT_ID \
--scope "$STORAGE_ID/blobServices/default/containers/$CONTAINER_NAME"

Events

For each uploaded file, a new event will be automatically generated and reduced into the UploadedFileEntity entity.

The event will look like this:

{
"version": 1,
"kind": "snapshot",
"superKind": "domain",
"requestID": "xxx",
"entityID": "xxxx",
"entityTypeName": "UploadedFileEntity",
"typeName": "UploadedFileEntity",
"value": {
"id": "xxx",
"metadata": {
// A bunch of fields (depending on Azure or AWS)
}
},
"createdAt": "2022-10-26T10:23:36.562Z",
"snapshottedEventCreatedAt": "2022-10-26T10:23:32.34Z",
"entityTypeName_entityID_kind": "UploadedFileEntity-xxx-b842-x-8975-xx-snapshot",
"id": "x-x-x-x-x",
"_rid": "x==",
"_self": "dbs/x==/colls/x=/docs/x==/",
"_etag": "\"x-x-0500-0000-x\"",
"_attachments": "attachments/",
"_ts": 123456
}
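
If you want to expose these uploads to your clients, you can project the entity into a regular Booster read model. A minimal sketch (the import path of UploadedFileEntity is an assumption; adjust it to wherever the rocket exports the entity in your version):

import { ReadModel, Projects } from '@boostercloud/framework-core'
import { UUID, ProjectionResult } from '@boostercloud/framework-types'
// Assumption: the rocket's core package exports the entity class; check your installed version
import { UploadedFileEntity } from '@boostercloud/rocket-file-uploads-core'

@ReadModel({
  authorize: 'all',
})
export class UploadedFileReadModel {
  public constructor(readonly id: UUID, readonly metadata: unknown) {}

  @Projects(UploadedFileEntity, 'id')
  public static projectUploadedFile(
    entity: UploadedFileEntity,
    current?: UploadedFileReadModel
  ): ProjectionResult<UploadedFileReadModel> {
    return new UploadedFileReadModel(entity.id, entity.metadata)
  }
}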

TODOs

  • Add file deletion to Azure and Local (only supported in AWS at the moment).
  • Optional storage deletion when unmounting the stack.
  • Optional events, in case you don't want to store that information in the events-store.
  • When deleting a file, save a deletion event in the events-store. Only uploads are stored at the moment.

Static Sites Rocket

This package is a configurable Booster rocket to add static site deployment to your Booster applications. It uploads the contents of your configured root path and serves them as a static site.

Usage

Install this package as a dev dependency in your Booster project (it's a dev dependency because it's only used during deployment, and we don't want this code to be uploaded to the project lambdas):

npm install --save-dev @boostercloud/rocket-static-sites-aws-infrastructure

In your Booster config file, pass a RocketDescriptor in the config.rockets array to configure the static site rocket:

import { Booster } from '@boostercloud/framework-core'
import { BoosterConfig } from '@boostercloud/framework-types'

Booster.configure('development', (config: BoosterConfig): void => {
config.appName = 'my-store'
config.rockets = [
{
packageName: '@boostercloud/rocket-static-sites-aws-infrastructure',
parameters: {
bucketName: 'test-bucket-name', // Required
rootPath: './frontend/dist', // Defaults to ./public
indexFile: 'main.html', // File to render when users access the CloudFront URL. Defaults to index.html
errorFile: 'error.html', // File to render when there's an error. Defaults to 404.html
}
},
]
})

Webhook Rocket

This rocket adds a Webhook to your Booster application. When the webhook is called, a function will be executed in your Booster application with the request as a parameter.

Supported Providers

  • Azure Provider
  • Local Provider

Usage

Add your rocket to your application in the Booster configuration file:

config.rockets = [buildBoosterWebhook(config).rocketForAzure()]

Then declare the function to initialize the BoosterWebhook:

function buildBoosterWebhook(config: BoosterConfig): BoosterWebhook {
return new BoosterWebhook(config, [
{
origin: 'test',
handlerClass: TestHandler,
},
{
origin: 'other',
handlerClass: FacebookHandler,
},
])
}
info

Parameters:

  • origin: Identifies the webhook. It will also be the name of the endpoint that is created.
  • handlerClass: A class with a handle method to handle the request.

The handle method should be like this one:

export class TestHandler {

constructor() {
}

public static async handle(webhookEventInterface: WebhookEvent, register: Register): Promise<WebhookHandlerReturnType> {
if (validationFails()) {
throw new InvalidParameterError("Error message");
}
return Promise.resolve({
body: { name: "my_name" }
});
}
}

Return type

Handle methods return a promise of WebhookHandlerReturnType or void. This object contains the headers and body to be returned as the response.

Example:

  public static async handle(webhookEventInterface: WebhookEvent, register: Register): Promise<WebhookHandlerReturnType> {
return Promise.resolve({
body: 'ok',
headers: {
Test: 'test header',
},
})
}

Demo

curl --request POST 'http://localhost:3000/webhook/test?param1=testvalue'

The webhookEventInterface object will be similar to this one:

{
origin: 'test',
method: 'POST',
url: '/test?param1=testvalue',
originalUrl: '/webhook/test?param1=testvalue',
headers: {
accept: '*/*',
'cache-control': 'no-cache',
host: 'localhost:3000',
'accept-encoding': 'gzip, deflate, br',
connection: 'keep-alive',
'content-length': '0'
},
query: { param1: 'testvalue' },
params: {},
rawBody: undefined,
body: {}
}
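
For instance, a handler could use that object to read the query parameters sent in the request. A quick sketch (imports omitted, as in the handler example above; the property names follow the event shape shown):

export class QueryParamsHandler {
  public static async handle(event: WebhookEvent, register: Register): Promise<WebhookHandlerReturnType> {
    // 'query' holds the parsed query string, e.g. { param1: 'testvalue' }
    const param1 = event.query['param1']
    return Promise.resolve({
      body: { receivedParam1: param1 },
    })
  }
}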

Testing

The Booster framework itself is thoroughly tested, so you can rely on it behaving as expected. However, you should also write your own tests to check that your application behaves as you expect. In this section, we share some recommendations on how to test your Booster application.

Testing Booster applications

To properly test a Booster application, you should create a test folder at the same level as the src one. Apart from that, test file names should follow the <my_test>.test.ts format.

When a Booster application is generated, you will have a script in package.json like this:

"scripts": {
"test": "nyc --extension .ts mocha --forbid-only \"test/**/*.test.ts\""
}

The only thing that you should add to this line are the AWS_SDK_LOAD_CONFIG=true and BOOSTER_ENV=test environment variables, so the script will look like this:

"scripts": {
"test": "AWS_SDK_LOAD_CONFIG=true BOOSTER_ENV=test nyc --extension .ts mocha --forbid-only \"test/**/*.test.ts\""
}

Testing with sinon-chai

The BoosterConfig can be accessed through Booster.config anywhere in a Booster application. To mock it properly, we recommend using sinon's replace method after configuring Booster.config as desired.

In the example below, we add two "empty" read models, since we iterate over Booster.config.readModels from a command handler:

// Test
import { replace, spy } from 'sinon'

const config = new BoosterConfig('test')
config.appName = 'testing-time'
config.providerPackage = '@boostercloud/framework-provider-aws'
config.readModels['WoW'] = {} as ReadModelMetadata
config.readModels['Amazing'] = {} as ReadModelMetadata
replace(Booster, 'config', config)

const spyMyCall = spy(MyCommand, 'myCall')
const command = new MyCommand('1', true)
const register = new Register('request-id-1')
const registerSpy = spy(register, 'events')
await MyCommand.handle(command, register)

expect(spyMyCall).to.have.been.calledWithExactly('WoW')
expect(spyMyCall).to.have.been.calledWithExactly('Amazing')
expect(registerSpy).to.have.been.calledWithExactly(new MyEvent('1', 'WoW'))
expect(registerSpy).to.have.been.calledWithExactly(new MyEvent('1', 'Amazing'))
// Example code
public static async handle(command: MyCommand, register: Register): Promise<void> {
const readModels = Booster.config.readModels
for (const readModelName in readModels) {
myCall(readModelName)
register.events(new MyEvent(command.ID, readModelName))
}
}

These are some files that might help you speed up your testing with Booster.

// <root_dir>/test/expect.ts
import * as chai from 'chai'

chai.use(require('sinon-chai'))
chai.use(require('chai-as-promised'))

export const expect = chai.expect

This expect helper gives you additional assertion methods like expect(<my_stub>).to.have.been.calledOnceWithExactly(<my_params...>).
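
For example, a minimal test for a command handler similar to the one sketched above could look like this (file and class names are illustrative):

// <root_dir>/test/commands/my-command.test.ts (hypothetical example)
import { replace, restore, spy } from 'sinon'
import { Booster } from '@boostercloud/framework-core'
import { BoosterConfig, Register } from '@boostercloud/framework-types'
import { expect } from '../expect'
import { MyCommand } from '../../src/commands/my-command'

describe('MyCommand', () => {
  afterEach(() => {
    restore() // undo the sinon replacements after each test
  })

  it('registers one event per configured read model', async () => {
    const config = new BoosterConfig('test')
    config.readModels['WoW'] = {} as any // "empty" read model, as in the example above
    replace(Booster, 'config', config)

    const register = new Register('request-id-1')
    const registerSpy = spy(register, 'events')

    await MyCommand.handle(new MyCommand('1', true), register)

    expect(registerSpy).to.have.been.calledOnce
  })
})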

# <root_dir>/.mocharc.yml
diff: true
require: 'ts-node/register'
extension:
- ts
package: './package.json'
recursive: true
reporter: 'spec'
timeout: 5000
full-trace: true
bail: true

Framework integration tests

The Booster framework integration tests package is used to test the Booster project itself, but it is also an example of how a Booster application could be tested. We encourage developers to have a look at our Booster project repository.

Some integration tests highly depend on the provider chosen for the project, and the infrastructure is normally deployed in the cloud right before the tests run. Once tests are completed, the application is torn down.

There are several types of integration tests in this package:

  • Tests to ensure that different packages integrate as expected with each other.
  • Tests to ensure that a Booster application behaves as expected when it is hit by a client (a GraphQL client).
  • Tests to ensure that the application behaves in the same way no matter what provider is selected.

TouchEntities

This functionality is useful when a new projection is added to a ReadModel and you want to apply it retroactively to the events that have already occurred. It is also helpful when there was an error when calculating a ReadModel or when the snapshot of an entity was not generated.

To migrate an existing entity to a new version, you need to call BoosterTouchEntityHandler.touchEntity to touch entities. For example, this command will touch all the entities of the class Cart:

import { Booster, BoosterTouchEntityHandler, Command } from '@boostercloud/framework-core'
import { Register } from '@boostercloud/framework-types'
import { Cart } from '../entities/cart'

@Command({
authorize: 'all',
})
export class TouchCommand {
public constructor() {}

public static async handle(_command: TouchCommand, _register: Register): Promise<void> {
const entitiesIdsResult = await Booster.entitiesIDs('Cart', 500, undefined)
const paginatedEntityIdResults = entitiesIdsResult.items
const carts = await Promise.all(
paginatedEntityIdResults.map(async (entity) => await Booster.entity(Cart, entity.entityID))
)
if (!carts || carts.length === 0) {
return
}
await Promise.all(
carts.map(async (cart) => {
const validCart = cart!
await BoosterTouchEntityHandler.touchEntity('Cart', validCart.id)
console.log('Touched', validCart)
return validCart.id
})
)
}
}

Please note that touching entities is an advanced feature that should be used with caution and only when necessary. It may affect your application performance and consistency if not used properly.

GraphQL API

GraphQL defines three kinds of operations: you use a mutation when you want to change data, a query when you want to get data on-demand, and a subscription when you want to receive data at the moment it is updated.

Knowing this, you can infer the relationship between those operations and your Booster components:

  • You send a command using a mutation.
  • You read a read model using a query.
  • You subscribe to a read model using a subscription.

How to send GraphQL requests

GraphQL uses two existing protocols:

  • HTTP for mutation and query operations.
  • WebSocket for subscription operations.

The reason for the WebSocket protocol is that, in order for subscriptions to work, there must be a way for the server to send data to clients when it changes. HTTP doesn't allow that, as it is always the client that initiates the request.

So you should use the graphqlURL to send GraphQL queries and mutations, and the websocketURL to send subscriptions. You can see both URLs after deploying your application.

Therefore:

  • To send a GraphQL mutation/query, you send an HTTP request to "<graphqlURL>", with method POST, and a JSON-encoded body with the mutation/query details.
  • To send a GraphQL subscription, you first connect to the "<websocketURL>", and then send a JSON-encoded message with the subscription details, following the "GraphQL over WebSocket" protocol.
note

You can also send queries and mutations through the WebSocket if that's convenient to you. See "The GraphQL over WebSocket protocol" to know more.

While it is fine to know how to manually send GraphQL requests, you normally don't need to deal with these low-level details, especially the WebSocket ones.

To have a great developer experience, we strongly recommend using a GraphQL client for your platform of choice. Here are some great ones:

  • Altair: Ideal for testing, sending manual requests, getting the schema, etc.
  • Apollo clients: These are the "go-to" SDKs to interact with a GraphQL API from your clients. It is very likely that there is a version for your client programming language. Check the "Using Apollo Client" section to know more about this.

Get GraphQL schema from deployed application

After deploying your application with the command boost deploy -e development, you can get your GraphQL schema by using a tool like Altair. The previous command displays multiple endpoints, one of them is graphqlURL, which has the following pattern:

https://<base_url>/<environment>/graphql

By entering this URL in Altair, the schema can be displayed as shown in the screenshots (you need to click the Docs button in the URL bar). You can check the available Queries and Mutations by clicking on their names:

(Screenshots: Altair showing the available queries and mutations.)

Sending commands

As mentioned in the previous section, we need to use a "mutation" to send a command. The structure of a mutation (the body of the request) is the following:

mutation {
command_name(input: {
input_field_list
})
}

Where:

  • command_name is the name of the class corresponding to the command you want to send
  • input_field_list is a list of pairs in the form of fieldName: fieldValue containing the data of your command. The field names correspond to the names of the properties you defined in the command class.

In the following example we send a command named "ChangeCart" that will add/remove an item to/from a shopping cart. The command requires the ID of the cart (cartId), the item identifier (sku) and the quantity of units we are adding/removing (quantity).

URL: "<graphqlURL>"
mutation {
ChangeCart(input: { cartId: "demo", sku: "ABC_01", quantity: 2 })
}

In case we are not using any GraphQL client, this would be the equivalent bare HTTP request:

URL: "<graphqlURL>"
METHOD: "POST"
{
"query": "mutation { ChangeCart(input: { cartId: \"demo\" sku: \"ABC_01\" quantity: 2 }) }"
}

And this would be the response:

{
"data": {
"ChangeCart": true
}
}
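
If you are not using a GraphQL client and want to send that request from code, a minimal sketch with fetch could look like this (replace <graphqlURL> with the URL of your deployed application):

// Sketch: send the ChangeCart mutation as a bare HTTP POST request
const response = await fetch('<graphqlURL>', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    query: 'mutation { ChangeCart(input: { cartId: "demo", sku: "ABC_01", quantity: 2 }) }',
  }),
})

const result = await response.json()
console.log(result) // { data: { ChangeCart: true } }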
note

Remember to set the proper access token for secured commands, check "Authorizing operations".

Reading read models

To read a specific read model, we need to use a "query" operation. The structure of the "query" (the body of the request) is the following:

query {
read_model_name(id: "<id of the read model>") {
selection_field_list
}
}

Where:

  • read_model_name is the name of the class corresponding to the read model you want to retrieve.
  • <id of the read model> is the ID of the specific read model instance you are interested in.
  • selection_field_list is a list with the names of the specific read model fields you want to get as response.

In the following example we send a query to read a read model named CartReadModel whose ID is demo. We get back its id and the list of cart items as response.

URL: "<graphqlURL>"
query {
CartReadModel(id: "demo") {
id
items
}
}

In case we are not using any GraphQL client, this would be the equivalent bare HTTP request:

URL: "<graphqlURL>"
METHOD: "POST"
{
"query": "query { CartReadModel(id: \"demo\") { id items } }"
}

And we would get the following as response:

{
"data": {
"CartReadModel": {
"id": "demo",
"items": [
{
"sku": "ABC_01",
"quantity": 2
}
]
}
}
}
note

Remember to set the proper access token for secured read models, check "Authorizing operations".

Subscribing to read models

To subscribe to a specific read model, we need to use a subscription operation, and it must be sent through the websocketURL using the GraphQL over WebSocket protocol.

Doing this process manually is a bit cumbersome. You will probably never need to do this, as GraphQL clients like Apollo abstract this process away. However, we will explain how to do it for learning purposes.

Before sending any subscription, you need to connect to the WebSocket to open the two-way communication channel. This connection is done differently depending on the client/library you use to manage web sockets. In this section, we will show examples using the wscat command line program. You can also use the online tool Altair.

Once you have connected successfully, you can use this channel to:

  • Send the subscription messages.
  • Listen for messages sent by the server with data corresponding to your active subscriptions.

The structure of the "subscription" (the body of the message) is exactly the same as the "query" operation:

subscription {
read_model_name(id: "<id of the read model>") {
selection_field_list
}
}

Where:

  • read_model_name is the name of the class corresponding to the read model you want to subscribe to.
  • <id of the read model> is the ID of the specific read model instance you are interested in.
  • selection_field_list is a list with the names of the specific read model fields you want to get when data is sent back to you.

In the following examples we use wscat to connect to the web socket. After that, we send the required messages to conform to the GraphQL over WebSocket protocol, including the subscription operation to the read model CartReadModel with ID demo.

  1. Connect to the web socket:
 wscat -c <websocketURL> -s graphql-ws
note

You should specify the graphql-ws subprotocol when connecting with your client via the Sec-WebSocket-Protocol header (in this case, wscat does that when you use the -s option).

Now we can start sending messages just by writing them and hitting the Enter key.

  2. Initiate the protocol connection:
{ "type": "connection_init" }

In case you want to authorize the connection, you need to send the authorization token in the payload.Authorization field:

{ "type": "connection_init", "payload": { "Authorization": "<your token>" } }
  3. Send a message with the subscription. We need to provide an ID for the operation. When the server sends us data back, it will include this same ID so that we know which subscription the received data belongs to (again, this is just for learning; GraphQL clients manage this for you):
{ "id": "1", "type": "start", "payload": { "query": "subscription { CartReadModel(id:\"demo\") { id items } }" } }

After a successful subscription, you won't receive anything in return. Now, every time the read model you subscribed to is modified, a new incoming message will appear in the socket with the updated version of the read model. This message will have exactly the same format as if you had done a query with the same parameters.

Following the previous example, we now send a command (using a mutation operation) that adds a new item with sku "ABC_02" to the CartReadModel. After it has been added, we receive the updated version of the read model through the socket.

  1. Send the following command (this time using an HTTP request):
URL: "<graphqlURL>"
mutation {
ChangeCart(input: { cartId: "demo", sku: "ABC_02", quantity: 3 })
}
  2. The following message (after formatting it) appears through the socket connection we had opened:
{
"id": "1",
"type": "data",
"payload": {
"data": {
"CartReadModel": {
"id": "demo",
"items": [
{
"sku": "ABC_01",
"quantity": 2
},
{
"sku": "ABC_02",
"quantity": 3
}
]
}
}
}
}
note

Remember that, in case you want to subscribe to a read model that is restricted to a specific set of roles, you must send the access token retrieved upon sign-in. Check "Authorizing operations" to know how to do this.

note

You can disable the creation of all the infrastructure and functionality needed to manage subscriptions by setting config.enableSubscriptions = false in your Booster config.
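
As a sketch, that setting goes in your environment configuration:

import { Booster } from '@boostercloud/framework-core'
import { BoosterConfig } from '@boostercloud/framework-types'

Booster.configure('production', (config: BoosterConfig): void => {
  config.appName = 'my-app'
  // Skip creating the WebSocket infrastructure used for GraphQL subscriptions
  config.enableSubscriptions = false
})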

Non exposing properties and parameters

By default, all properties and parameters of the command constructor and/or read model are accessible through GraphQL. It is possible to hide any of them by adding the @NonExposed annotation to the constructor property or parameter.

Example

@ReadModel({
authorize: 'all',
})
export class CartReadModel {
@NonExposed
private internalProperty: number

public constructor(
readonly id: UUID,
readonly cartItems: Array<CartItem>,
readonly checks: number,
public shippingAddress?: Address,
public payment?: Payment,
public cartItemsIds?: Array<string>,
@NonExposed readonly internalParameter?: number
) {
...
}

...
}

Adding before hooks to your read models

When you send queries or subscriptions to your read models, you can tell Booster to execute some code before executing the operation. These are called before hooks, and they receive a ReadModelRequestEnvelope object representing the current request.

interface ReadModelRequestEnvelope<TReadModel> {
currentUser?: UserEnvelope // The current authenticated user
requestID: UUID // An ID assigned to this request
key?: { // If present, contains the id and sequenceKey that identify a specific read model
id: UUID
sequenceKey?: SequenceKey
}
className: string // The read model class name
filters: ReadModelRequestProperties<TReadModel> // Filters set in the GraphQL query
limit?: number // Query limit if set
afterCursor?: unknown // For paginated requests, id to start reading from
}

In before hooks, you can either abort the request or alter and return the request object to change the behavior of your request. Before hooks are useful for many use cases, but they're especially useful for adding fine-grained access control, for example, to enforce a filter that restricts a logged-in user to accessing only the read model objects they own.

When a before hook throws an exception, the request is aborted and the error is sent back to the user. To continue with the request, the hook must return the request object.

To define a before hook, you pass a list of functions with the right signature to the before parameter of the read model decorator:

@ReadModel({
authorize: [User],
before: [validateUser],
})
export class CartReadModel {
public constructor(
readonly id: UUID,
readonly userId: UUID
) {}
// Your projections go here
}

function validateUser(request: ReadModelRequestEnvelope<CartReadModel>): ReadModelRequestEnvelope<CartReadModel> {
if (request.filters?.userId?.eq !== request.currentUser?.id) throw new NotAuthorizedError("...")
return request
}

You can also define more than one before hook for a read model, and they will be chained, passing the resulting request object from one hook to the next.

note

The order in which the before hooks are specified matters.

import { changeFilters } from '../../filters-helper' // You can also use external functions!

@ReadModel({
authorize: [User],
before: [validateUser, validateEmail, changeFilters],
})
export class CartReadModel {
public constructor(
readonly id: UUID,
readonly userId: UUID
) {}

// Your projections go here
}

function validateUser(request: ReadModelRequestEnvelope<CartReadModel>): ReadModelRequestEnvelope<CartReadModel> {
if (request.filters?.userId?.eq !== request.currentUser?.id) throw new NotAuthorizedError("...")
return request
}

function validateEmail(request: ReadModelRequestEnvelope<CartReadModel>): ReadModelRequestEnvelope<CartReadModel> {
if (!request.filters.email.includes('myCompanyDomain.com')) throw new NotAuthorizedError("...")
return request
}

Adding before hooks to your commands

You can also use before hooks in your command handlers, and they work like the Read Model ones, with a slight difference: we don't modify filters but inputs (the parameters sent with the command). Apart from that, it's pretty much the same; here's an example:

@Command({
authorize: [User],
before: [beforeFn],
})
export class ChangeCartItem {
public constructor(readonly cartId: UUID, readonly productId: UUID, readonly quantity: number) {
}
}

function beforeFn(input: CommandInput, currentUser?: UserEnvelope): CommandInput {
if (input.cartUserId !== currentUser?.id) {
throw new NonAuthorizedUserException() // We don't let this user trigger the command
}
return input
}

As you can see, we just check whether the cartUserId is equal to currentUser.id, which is the user id extracted from the auth token. This way, we can throw an exception and prevent this user from calling the command.

Adding before hooks to your queries

You can also use before hooks in your queries, and they work like the Read Model ones, with a slight difference: we don't modify filters but inputs (the parameters sent with the query). Apart from that, it's pretty much the same; here's an example:

@Query({
authorize: 'all',
before: [CartTotalQuantity.beforeFn],
})
export class CartTotalQuantity {
public constructor(readonly cartId: UUID, @NonExposed readonly multiply: number) {}

public static async beforeFn(input: QueryInput, currentUser?: UserEnvelope): Promise<QueryInput> {
input.multiply = 100
return input
}
}

Reading events

You can also fetch events directly if you need to. To do so, there are two kinds of queries with the following structure:

query {
eventsByEntity(entity: <name of entity>, entityID: "<id of the entity>") {
selection_field_list
}
}

query {
eventsByType(type: <name of event>) {
selection_field_list
}
}

Where:

  • <name of your entity> is the name of the class corresponding to the entity whose events you want to retrieve.
  • <id of the entity> is the ID of the specific entity instance whose events you are interested in. This is optional.
  • <name of event> is the name of the class corresponding to the event type whose instances you want to retrieve.
  • selection_field_list is a list with the names of the specific fields you want to get as response. See the response example below to know more.

Examples

  URL: "<graphqlURL>"

A) Read all events associated with a specific instance (a specific ID) of the entity Cart

query {
eventsByEntity(entity: Cart, entityID: "ABC123") {
type
entity
entityID
requestID
createdAt
value
}
}

B) Read all events associated with any instance of the entity Cart

query {
eventsByEntity(entity: Cart) {
type
entity
entityID
requestID
createdAt
value
}
}

For these cases, you would get an array of event envelopes as a response. This means that you get some metadata related to the event along with the event content, which can be found inside the "value" field.

The response looks like this:

{
"data": {
"eventsByEntity": [
{
"type": "CartItemChanged",
"entity": "Cart",
"entityID": "ABC123",
"requestID": "7a9cc6a7-7c7f-4ef0-aef1-b226ae4d94fa",
"createdAt": "2021-05-12T08:41:13.792Z",
"value": {
"productId": "73f7818c-f83e-4482-be49-339c004b6fdf",
"cartId": "ABC123",
"quantity": 2
}
}
]
}
}

C) Read events of a specific type

query {
eventsByType(type: CartItemChanged) {
type
entity
entityID
requestID
createdAt
value
}
}

The response would have the same structure as seen in the previous examples. The only difference is that this time you will get only the events with the type you have specified ("CartItemChanged").

Time filters

Optionally, for any of the previous queries, you can include from and/or to time filters to get only those events that happened inside that time range. You must use a string with a time in ISO format with any precision you like, for example:

  • from:"2021" : Events created in 2021 or later.
  • from:"2021-02-12" to:"2021-02-13" : Events created during February 12th.
  • from:"2021-03-16T16:16:25.178" : Events created at that date and time, using millisecond precision, or later.

Time filters examples

A) Cart events from February 23rd to July 20th, 2021

query {
eventsByEntity(entity: Cart, from: "2021-02-23", to: "2021-07-20") {
type
entity
entityID
requestID
createdAt
value
}
}

B) CartItemChanged events from February 25th to February 28th, 2021

query {
eventsByType(type: CartItemChanged, from: "2021-02-25", to: "2021-02-28") {
type
entity
entityID
requestID
createdAt
value
}
}

Known limitations

  • Subscriptions don't work for the events API yet
  • You can only query events, but not write them through this API. Use a command for that.
  • Currently, only available on the AWS provider.

Filter & Pagination

Filtering a read model

The Booster GraphQL API provides support for filtering Read Models on queries and subscriptions.

Using the GraphQL API endpoint you can retrieve the schema of your application, so you can see what filters are available for every Read Model and its properties. You can filter like this:

Searching for a specific Read Model by id

query {
ProductReadModels(filter: { id: { eq: "test-id" } }) {
id
sku
availability
price
}
}

Supported filters

The currently supported filters are the following ones:

Boolean filters

Filter   Value        Description
eq       true/false   Equal to
ne       true/false   Not equal to

Example:

query {
ProductReadModels(filter: { availability: { eq: true } }) {
id
sku
availability
price
}
}

Number filters

Filter   Value     Description
eq       Float     Equal to
ne       Float     Not equal to
gt       Float     Greater than
gte      Float     Greater than or equal to
lt       Float     Lower than
lte      Float     Lower than or equal to
in       [Float]   Exists in given array

Example:

query {
ProductReadModels(filter: { price: { gt: 200 } }) {
id
sku
availability
price
}
}

String filters

Filter       Value      Description
eq           String     Equal to
ne           String     Not equal to
gt           String     Greater than
gte          String     Greater than or equal to
lt           String     Lower than
lte          String     Lower than or equal to
in           [String]   Exists in given array
beginsWith   String     Starts with a given substring
contains     String     Contains a given substring
regex*       String     Regular expression
iRegex*      String     Case-insensitive regular expression

note

regex and iRegex are supported by the Azure and Local providers only.

Example:

query {
ProductReadModels(filter: { sku: { beginsWith: "jewelry" } }) {
id
sku
availability
price
}
}
note

eq and ne are valid filters for checking if a field value is null or not null.

Array filters

Filter     Value    Description
includes   Object   Includes a given object

Example:

query {
CartReadModels(filter: { itemsIds: { includes: "test-item" } }) {
id
price
itemsIds
}
}
note

Right now, with complex properties in arrays, you can only filter them if you know the exact value of an element; it is not possible to filter by a property of the element. As a workaround, you can keep an array of ids of the complex property and filter on that property, as in the example above.

Filter combinators

All the filters can be combined to create a more complex search on the same properties of the ReadModel.

Filter   Value           Description
and      [Filters]       AND - all the filters on the list have a match
or       [Filters]       OR - at least one filter of the list has a match
not      Filter/and/or   The element does not match the filter

Example:

query {
CartReadModels(filter: { or: [{ id: { contains: "a" } }, { id: { contains: "b" } }] }) {
id
price
itemsIds
}
}

IsDefined operator

Filter      Value        Description
isDefined   true/false   Field exists or not

Example:

query {
CartReadModels(filter: { price: { isDefined: true } }) {
id
price
itemsIds
}
}

Getting and filtering read models data at code level

Booster allows you to get your read models' data in your command handlers and event handlers using the Booster.readModel method.

For example, you can filter and get the total number of products that meet your criteria in your commands like this:

@Command({
authorize: 'all',
})
export class GetProductsCount {
public constructor(readonly filters: Record<string, any>) {}

public static async handle(): Promise<{ count: number }> {
const searcher = Booster.readModel(ProductReadModel)

searcher.filter({
sku: { contains: 'toy' },
or: [
{
description: { contains: 'fancy' },
},
{
description: { contains: 'great' },
},
],
})

const result = await searcher.search()
return { count: result.length }
}
}

Warning: Notice that ReadModels are eventually consistent objects that are calculated as all events in all entities that affect the read model are settled. You should not assume that a read model is a proper source of truth, so you shouldn't use this feature for data validation. If you need to query the most up-to-date current state, consider fetching your Entities, instead of ReadModels, with Booster.entity.
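
For example (a sketch; Cart is the illustrative entity used elsewhere in these docs), fetching the current state of an entity looks like this:

import { Booster } from '@boostercloud/framework-core'
import { Cart } from '../entities/cart'

// Fetch the latest snapshot of a single Cart entity by its ID, e.g. from a command handler
async function currentCart(cartId: string): Promise<Cart | undefined> {
  return await Booster.entity(Cart, cartId)
}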

Using sorting

Booster also allows you to sort your read models data. For example, you can sort and get cart read models with a GraphQL query like this:

{
ListCartReadModels(filter: {}, limit: 5, sortBy: {
shippingAddress: {
firstName: ASC
}
}) {
items {
id
cartItems
checks
shippingAddress {
firstName
}
payment {
cartId
}
cartItemsIds
}
cursor
}
}

This is a preview feature available only for some Providers and with some limitations:

  • Azure:
    • Sort by one field supported.
    • Nested fields supported.
    • Sort by more than one field: unsupported.
  • Local:
    • Sort by one field supported.
    • Nested fields supported.
    • Sort by more than one field: unsupported.

Warning: It is not possible to sort by fields defined as Interface, only by classes or primitive types.

Using pagination

The Booster GraphQL API includes a type for your read models that stands for List{"your-read-model-name"}, which is the official way to work with pagination. Alternatively, there is another type without the List prefix, which will be deprecated in future versions.

The Read Model List type includes some new parameters that can be used on queries:

  • limit: an integer that specifies the maximum number of items to be returned.

  • afterCursor: a parameter to set the cursor property returned by the previous query, if not null.

    Example:

query {
ListProductReadModels
(
limit: 1,
afterCursor: { id: "last-page-item"}
) {
id
sku
availability
price
}
}

Besides the parameters, this type also returns a {your-read-model-name}Connection type, which includes the following properties:

  • cursor: if there are more results to paginate, it will contain the object to pass to the afterCursor parameter on the next query. If there aren't more items to be shown, it will be undefined.
  • items: the list of items returned by the query; if there aren't any, it will be an empty list.
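
Putting it together, here is a sketch of walking through all the pages from code using bare HTTP requests (ListProductReadModels and the field list come from the example above; the cursor is assumed to have the { id: ... } shape shown there):

// Sketch: fetch every ProductReadModel page by page using the returned cursor
async function fetchAllProducts(graphqlURL: string): Promise<unknown[]> {
  const items: unknown[] = []
  let cursor: { id: string } | undefined = undefined

  do {
    const afterCursor = cursor ? `, afterCursor: { id: "${cursor.id}" }` : ''
    const query = `query { ListProductReadModels(limit: 100${afterCursor}) { items { id sku availability price } cursor } }`

    const response = await fetch(graphqlURL, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ query }),
    })
    const page = (await response.json()).data.ListProductReadModels

    items.push(...page.items)
    cursor = page.cursor // undefined when there are no more pages to read
  } while (cursor)

  return items
}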

Using Apollo Client

One of the best clients to connect to a GraphQL API is the Apollo client, and there is most likely a version for your client technology of choice (JavaScript, iOS, Android, etc.).

We recommend referring to the documentation of those clients to learn how to use them. Here is an example of how to fully instantiate the JavaScript client so that it works for queries, mutations and subscriptions:

import { split, HttpLink } from '@apollo/client'
import { getMainDefinition } from '@apollo/client/utilities'
import { WebSocketLink } from '@apollo/client/link/ws'
import { ApolloClient, InMemoryCache } from '@apollo/client'
import { SubscriptionClient } from 'subscriptions-transport-ws'

// Helper function that checks if a GraphQL operation is a subscription or not
function isSubscriptionOperation({ query }) {
const definition = getMainDefinition(query)
return definition.kind === 'OperationDefinition' && definition.operation === 'subscription'
}

// Create an HTTP link for sending queries and mutations
const httpLink = new HttpLink({
uri: '<graphqlURL>',
})

// Create a SubscriptionClient and a WebSocket link for sending subscriptions
const subscriptionClient = new SubscriptionClient('<websocketURL>', {
reconnect: true,
})
const wsLink = new WebSocketLink(subscriptionClient)

// Combine both links so that depending on the operation, it uses one or another
const splitLink = split(isSubscriptionOperation, wsLink, httpLink)

// Finally, create the client using the link created above
const client = new ApolloClient({
link: splitLink,
cache: new InMemoryCache(),
})

Now, we can send queries, mutations and subscriptions using the client instance:

import gql from 'graphql-tag'

// Query the CartReadModel
const readModelData = await client.query({
variables: {
cartID: 'demo',
},
query: gql`
query QueryCart($cartID: ID!) {
CartReadModel(id: $cartID) {
id
items
}
}
`,
})

// Send a command (mutation)
const commandResult = await client.mutate({
variables: {
cartID: 'demo',
sku: 'ABC_02',
},
mutation: gql`
mutation AddOneItemToCart($cartID: ID!, $sku: string!) {
ChangeCart(input: { cartId: $cartID, sku: $sku, quantity: 1 })
}
`,
})

// Subscribe to changes in the CartReadModel
const subscriptionOperation = client.subscribe({
variables: {
cartID: 'demo',
},
query: gql`
subscription SubscribeToCart($cartID: ID!) {
CartReadModel(id: $cartID) {
id
cartItems
}
}
`,
})

subscriptionOperation.subscribe({
next: (cartReadModel) => {
// This function is called every time the CartReadModel with ID="demo" is changed
// Parameter "cartReadModel" contains the latest version of the cart
},
})

Authorizing operations

When you have a command or read model whose access is authorized to users with a specific set of roles (see Authentication and Authorization), you need to use an authorization token to send queries, mutations or subscriptions to that command or read model.

You can use the Authentication Rocket to authorize operations; see its documentation and, more specifically, the Sign in section to know how to get a token. Once you have a token, the way to send it varies depending on the protocol you are using to send GraphQL operations:

  • For HTTP, you need to send the HTTP header Authorization with the token, making sure you prefix it with Bearer (the kind of token Booster uses). For example:
Authorization: Bearer <your token>
  • For WebSocket connections, you need to send the token in the payload.Authorization field of the connection_init message (as shown in "Subscribing to read models"). For example:
{ "type": "connection_init", "payload": { "Authorization": "<your token>" } }

You normally won't be sending tokens in such a low-level way. GraphQL clients have easier ways to send these tokens. See "Sending tokens with Apollo clients".

Sending tokens with Apollo clients

We recommend going to the specific documentation of the specific Apollo client you are using to know how to send tokens. However, the basics of this guide remains the same. Here is an example of how you would configure the Javascript/Typescript Apollo client to send the authorization token. The example is exactly the same as the one shown in the Using Apollo clients section, but with the changes needed to send the token. Notice that <AuthApiEndpoint> and <idToken> are obtained from the Authentication Rocket.

import { split, HttpLink, ApolloLink } from '@apollo/client'
import { getMainDefinition } from '@apollo/client/utilities'
import { WebSocketLink } from '@apollo/client/link/ws'
import { ApolloClient, InMemoryCache } from '@apollo/client'
import { SubscriptionClient } from 'subscriptions-transport-ws'

function isSubscriptionOperation({ query }) {
const definition = getMainDefinition(query)
return definition.kind === 'OperationDefinition' && definition.operation === 'subscription'
}

// CHANGED: We now use the AuthApiEndpoint obtained by the auth rocket
const httpLink = new HttpLink({
uri: '<graphqlURL>',
})

// CHANGED: We create an "authLink" that modifies the operation by adding the token to the headers
const authLink = new ApolloLink((operation, forward) => {
operation.setContext({
headers: {
Authorization: 'Bearer <idToken>',
},
})
return forward(operation)
})

// <-- CHANGED: Concatenate the links so that the "httpLink" receives the operation with the headers set by the "authLink"
const httpLinkWithAuth = authLink.concat(httpLink)

const subscriptionClient = new SubscriptionClient('<websocketURL>', {
reconnect: true,
// CHANGED: added a "connectionParam" property with a function that returns the `Authorizaiton` header containing our token
connectionParams: () => {
return {
Authorization: 'Bearer <idToken>',
}
},
})
const wsLink = new WebSocketLink(subscriptionClient)

const splitLink = split(isSubscriptionOperation, wsLink, httpLinkWithAuth) // Note that we now are using "httpLinkWithAuth"

const client = new ApolloClient({
link: splitLink,
cache: new InMemoryCache(),
})

Refreshing tokens with Apollo clients

Authorization tokens expire after a certain amount of time. When a token is expired, you will get an error and you will need to call the refresh the token endpoint to get a new token. After you have done so, you need to use the new token in your GraphQL operations.

There are several ways to do this. Here we show the simplest one for learning purposes.

First, we modify the example shown in the section Sending tokens with apollo clients so that the token is stored in a global variable and the Apollo links get the token from it. That variable will be updated when the user signs-in and the token is refreshed:

import { split, HttpLink, ApolloLink } from '@apollo/client'
import { getMainDefinition } from '@apollo/client/utilities'
import { WebSocketLink } from '@apollo/client/link/ws'
import { ApolloClient, InMemoryCache } from '@apollo/client'
import { SubscriptionClient } from 'subscriptions-transport-ws'

let authToken = undefined // <-- CHANGED: This variable will hold the token and will be updated everytime the token is refreshed

function isSubscriptionOperation({ query }) {
const definition = getMainDefinition(query)
return definition.kind === 'OperationDefinition' && definition.operation === 'subscription'
}

const httpLink = new HttpLink({
uri: '<AuthApiEndpoint>',
})

const authLink = new ApolloLink((operation, forward) => {
if (authToken) {
operation.setContext({
headers: {
Authorization: `Bearer ${authToken}`, // <-- CHANGED: We use the "authToken" global variable
},
})
}
return forward(operation)
})

const httpLinkWithAuth = authLink.concat(httpLink)

const subscriptionClient = new SubscriptionClient('<websocketURL>', {
reconnect: true,
// CHANGED: added a "connectionParam" property with a function that returns the `Authorizaiton` header containing our token
connectionParams: () => {
if (authToken) {
return {
Authorization: `Bearer ${authToken}`, // <-- CHANGED: We use the "authToken" global variable
}
}
return {}
},
})
const wsLink = new WebSocketLink(subscriptionClient)

const splitLink = split(isSubscriptionOperation, wsLink, httpLinkWithAuth)

const client = new ApolloClient({
link: splitLink,
cache: new InMemoryCache(),
})

Now, when the user signs-in or when the token is refreshed, we need to do two things:

  1. Update the global variable authToken with the new token.
  2. Reconnect the socket used by the subscription client by doing subscriptionClient.close(false).

You might be wondering why we need to do the second step. The reason is that, with operations sent through HTTP, the token goes along with every operation, in the headers. However, with operations sent through WebSockets, like subscriptions, the token is only sent when the socket connection is established. For this reason, everytime we update the token we need to reconnect the SubscriptionClient so that it sends again the token (the updated one in this case).

The GraphQL over WebSocket protocol

Sockets are channels for two-way communication that doesn't follow the request-response cycle, a characteristic feature of the HTTP protocol. One part can send many messages and the other part can receive all of them but only answer to some specific ones. What is more, messages could come in any order. For example, one part can send two messages and receive the response of the second message before the response of the first message.

For these reasons, in order to have an effective non-trivial communication through sockets, a sub-protocol is needed. It would be in charge of making both parts understand each other, share authentication tokens, matching response to the corresponding requests, etc.

The Booster WebSocket communication uses the "GraphQL over WebSocket" protocol as subprotocol. It is in charge of all the low level stuff needed to properly send subscription operations to read models and receive the corresponding data.

You don't need to know anything about this to develop using Booster, neither in the backend side nor in the frontend side (as all the Apollo GraphQL clients uses this protocol), but it is good to know it is there to guarantee a proper communication. In case you are really curious, you can read about the protocol here.

note

The WebSocket communication in Booster only supports this subprotocol, whose identifier is graphql-ws. For this reason, when you connect to the WebSocket provisioned by Booster, you must specify the graphql-ws subprotocol. If not, the connection won't succeed.

The structure of the "query" (the body of the request) is the following:

query {
read_model_name(id: "<id of the read model>") {
selection_field_list
}
}

Where:

  • read_model_name is the name of the class corresponding to the read model you want to retrieve.
  • <id of the read model> is the ID of the specific read model instance you are interested in.
  • selection_field_list is a list with the names of the specific read model fields you want to get as response.

In the following example we send a query to read a read model named CartReadModel whose ID is demo. We get back its id and the list of cart items as a response.

URL: "<graphqlURL>"
query {
CartReadModel(id: "demo") {
id
items
}
}

In case we are not using any GraphQL client, this would be the equivalent bare HTTP request:

URL: "<graphqlURL>"
METHOD: "POST"
{
"query": "query { CartReadModel(id: \"demo\") { id items } }"
}

And we would get the following response:

{
"data": {
"CartReadModel": {
"id": "demo",
"items": [
{
"sku": "ABC_01",
"quantity": 2
}
]
}
}
}
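
For reference, the same bare request can also be sent from TypeScript with the standard fetch API. This is only a minimal sketch, assuming a Node 18+ or browser environment where fetch is available; <graphqlURL> is the GraphQL endpoint of your deployed Booster application:

// Send the same query with plain fetch (no GraphQL client involved)
const response = await fetch('<graphqlURL>', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({
    query: 'query { CartReadModel(id: "demo") { id items } }',
  }),
})
const { data } = await response.json()
console.log(data.CartReadModel) // { id: 'demo', items: [...] }
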
note

Remember to set the proper access token for secured read models, check "Authorizing operations".

Subscribing to read models

To subscribe to a specific read model, we need to use a subscription operation, and it must be sent through the websocketURL using the GraphQL over WebSocket protocol.

Doing this process manually is a bit cumbersome. You will probably never need to do this, as GraphQL clients like Apollo abstract this process away. However, we will explain how to do it for learning purposes.

Before sending any subscription, you need to connect to the WebSocket to open the two-way communication channel. This connection is done differently depending on the client/library you use to manage web sockets. In this section, we will show examples using the wscat command line program. You can also use the online tool Altair.

Once you have connected successfully, you can use this channel to:

  • Send the subscription messages.
  • Listen for messages sent by the server with data corresponding to your active subscriptions.

The structure of the "subscription" (the body of the message) is exactly the same as the "query" operation:

subscription {
read_model_name(id: "<id of the read model>") {
selection_field_list
}
}

Where:

  • read_model_name is the name of the class corresponding to the read model you want to subscribe to.
  • <id of the read model> is the ID of the specific read model instance you are interested in.
  • selection_field_list is a list with the names of the specific read model fields you want to get when data is sent back to you.

In the following examples we use wscat to connect to the web socket. After that, we send the messages required to conform to the GraphQL over WebSocket protocol, including the subscription operation on the read model CartReadModel with ID demo.

  1. Connect to the web socket:
 wscat -c <websocketURL> -s graphql-ws
note

You should specify the graphql-ws subprotocol when connecting with your client via the Sec-WebSocket-Protocol header (in this case, wscat does that when you use the -s option).

Now we can start sending messages just by writing them and hitting the Enter key.

  2. Initiate the protocol connection:
{ "type": "connection_init" }

In case you want to authorize the connection, you need to send the authorization token in the payload.Authorization field:

{ "type": "connection_init", "payload": { "Authorization": "<your token>" } }
  3. Send a message with the subscription. We need to provide an ID for the operation. When the server sends us data back, it will include this same ID so that we know which subscription the received data belongs to (again, this is just for learning; GraphQL clients manage this for you):
{ "id": "1", "type": "start", "payload": { "query": "subscription { CartReadModel(id:\"demo\") { id items } }" } }

After a successful subscription, you won't receive anything in return. Now, every time the read model you subscribed to is modified, a new incoming message will appear in the socket with the updated version of the read model. This message will have exactly the same format as if you had done a query with the same parameters.

Continuing with the previous example, we now send a command (using a mutation operation) that adds a new item with sku "ABC_02" to the CartReadModel. After it has been added, we receive the updated version of the read model through the socket.

  1. Send the following command (this time using an HTTP request):
URL: "<graphqlURL>"
mutation {
ChangeCart(input: { cartId: "demo", sku: "ABC_02", quantity: 3 })
}
  2. The following message (formatted here for readability) appears through the socket connection we had opened:
{
"id": "1",
"type": "data",
"payload": {
"data": {
"CartReadModel": {
"id": "demo",
"items": [
{
"sku": "ABC_01",
"quantity": 2
},
{
"sku": "ABC_02",
"quantity": 3
}
]
}
}
}
}
note

Remember that, in case you want to subscribe to a read model that is restricted to a specific set of roles, you must send the access token retrieved upon sign-in. Check "Authorizing operations" to know how to do this.

note

You can disable the creation of all the infrastructure and functionality needed to manage subscriptions by setting config.enableSubscriptions=false in your Booster.config block
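
For reference, a minimal sketch of how that flag could look in your environment configuration, following the same src/config/config.ts layout used elsewhere in these docs (appName and providerPackage are placeholders):

import { Booster } from '@boostercloud/framework-core'
import { BoosterConfig } from '@boostercloud/framework-types'

Booster.configure('production', (config: BoosterConfig): void => {
  config.appName = 'app-name'
  config.providerPackage = '@boostercloud/framework-provider-x'
  // Skip provisioning the WebSocket and subscription-related infrastructure
  config.enableSubscriptions = false
})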

Non exposing properties and parameters

By default, all properties and parameters of the command constructor and/or read model are accessible through GraphQL. You can hide any of them by adding the @NonExposed decorator to the class property or constructor parameter.

Example

@ReadModel({
authorize: 'all',
})
export class CartReadModel {
@NonExposed
private internalProperty: number

public constructor(
readonly id: UUID,
readonly cartItems: Array<CartItem>,
readonly checks: number,
public shippingAddress?: Address,
public payment?: Payment,
public cartItemsIds?: Array<string>,
@NonExposed readonly internalParameter?: number
) {
...
}

...
}

Adding before hooks to your read models

When you send queries or subscriptions to your read models, you can tell Booster to execute some code before executing the operation. These are called before hooks, and they receive a ReadModelRequestEnvelope object representing the current request.

interface ReadModelRequestEnvelope<TReadModel> {
currentUser?: UserEnvelope // The current authenticated user
requestID: UUID // An ID assigned to this request
key?: { // If present, contains the id and sequenceKey that identify a specific read model
id: UUID
sequenceKey?: SequenceKey
}
className: string // The read model class name
filters: ReadModelRequestProperties<TReadModel> // Filters set in the GraphQL query
limit?: number // Query limit if set
afterCursor?: unknown // For paginated requests, id to start reading from
}

In before hooks, you can either abort the request or alter and return the request object to change the behavior of your request. Before hooks are useful for many use cases, but they're especially useful for adding fine-grained access control. For example, you can enforce a filter that restricts a logged-in user to accessing only the read model objects they own.

When a before hook throws an exception, the request is aborted and the error is sent back to the user. For the request to continue, the hook must return the request object.

To define before hooks, you pass a list of functions with the right signature to the before parameter of the read model decorator:

@ReadModel({
authorize: [User],
before: [validateUser],
})
export class CartReadModel {
public constructor(
readonly id: UUID,
readonly userId: UUID
) {}
// Your projections go here
}

function validateUser(request: ReadModelRequestEnvelope<CartReadModel>): ReadModelRequestEnvelope<CartReadModel> {
if (request.filters?.userId?.eq !== request.currentUser?.id) throw new NotAuthorizedError("...")
return request
}

You can also define more than one before hook for a read model, and they will be chained, sending the resulting request object from a hook to the next one.

note

The order in which the before hooks are specified matters; they are executed in that order.

import { changeFilters } from '../../filters-helper' // You can also use external functions!

@ReadModel({
authorize: [User],
before: [validateUser, validateEmail, changeFilters],
})
export class CartReadModel {
public constructor(
readonly id: UUID,
readonly userId: UUID
) {}

// Your projections go here
}

function validateUser(request: ReadModelRequestEnvelope<CartReadModel>): ReadModelRequestEnvelope<CartReadModel> {
if (request.filters?.userId?.eq !== request.currentUser?.id) throw new NotAuthorizedError("...")
return request
}

function validateEmail(request: ReadModelRequestEnvelope<CartReadModel>): ReadModelRequestEnvelope<CartReadModel> {
if (!request.filters.email.includes('myCompanyDomain.com')) throw new NotAuthorizedError("...")
return request
}

Adding before hooks to your commands

You can also use before hooks in your command handlers. They work like the read model ones, with a slight difference: you don't modify filters but inputs (the parameters sent with the command). Apart from that, it's pretty much the same. Here's an example:

@Command({
authorize: [User],
before: [beforeFn],
})
export class ChangeCartItem {
public constructor(readonly cartId: UUID, readonly productId: UUID, readonly quantity: number) {
}
}

function beforeFn(input: CommandInput, currentUser?: UserEnvelope): CommandInput {
if (input.cartUserId !== currentUser.id) {
throw new NonAuthorizedUserException() // We don't let this user trigger the command
}
return input
}

As you can see, we just check whether the cartUserId is equal to currentUser.id, which is the user ID extracted from the auth token. This way, we can throw an exception and prevent this user from calling this command.

Adding before hooks to your queries

You can also use before hooks in your queries. They work like the read model ones, with a slight difference: you don't modify filters but inputs (the parameters sent with the query). Apart from that, it's pretty much the same. Here's an example:

@Query({
authorize: 'all',
before: [CartTotalQuantity.beforeFn],
})
export class CartTotalQuantity {
public constructor(readonly cartId: UUID, @NonExposed readonly multiply: number) {}

public static async beforeFn(input: QueryInput, currentUser?: UserEnvelope): Promise<QueryInput> {
input.multiply = 100
return input
}
}

Reading events

You can also fetch events directly if you need to. To do so, there are two kinds of queries with the following structure:

query {
eventsByEntity(entity: <name of entity>, entityID: "<id of the entity>") {
selection_field_list
}
}

query {
eventsByType(type: <name of event>) {
selection_field_list
}
}

Where:

  • <name of entity> is the name of the class corresponding to the entity whose events you want to retrieve.
  • <id of the entity> is the ID of the specific entity instance whose events you are interested in. This parameter is optional.
  • <name of event> is the name of the class corresponding to the event type whose instances you want to retrieve.
  • selection_field_list is a list with the names of the specific fields you want to get as response. See the response example below to know more.

Examples

  URL: "<graphqlURL>"

A) Read all events associated with a specific instance (a specific ID) of the entity Cart

query {
eventsByEntity(entity: Cart, entityID: "ABC123") {
type
entity
entityID
requestID
createdAt
value
}
}

B) Read all events associated with any instance of the entity Cart

query {
eventsByEntity(entity: Cart) {
type
entity
entityID
requestID
createdAt
value
}
}

For these cases, you would get an array of event envelopes as a response. This means that you get some metadata related to the event along with the event content, which can be found inside the "value" field.

The response looks like this:

{
"data": {
"eventsByEntity": [
{
"type": "CartItemChanged",
"entity": "Cart",
"entityID": "ABC123",
"requestID": "7a9cc6a7-7c7f-4ef0-aef1-b226ae4d94fa",
"createdAt": "2021-05-12T08:41:13.792Z",
"value": {
"productId": "73f7818c-f83e-4482-be49-339c004b6fdf",
"cartId": "ABC123",
"quantity": 2
}
}
]
}
}

C) Read events of a specific type

query {
eventsByType(type: CartItemChanged) {
type
entity
entityID
requestID
createdAt
value
}
}

The response would have the same structure as seen in the previous examples. The only difference is that this time you will only get the events with the type you have specified ("CartItemChanged").

Time filters

Optionally, for any of the previous queries, you can include from and/or to time filters to get only those events that happened within that time range. You must use a string with a time in ISO format, with any precision you like. For example:

  • from:"2021" : Events created on 2021 year or up.
  • from:"2021-02-12" to:"2021-02-13" : Events created during February 12th.
  • from:"2021-03-16T16:16:25.178" : Events created at that date and time, using millisecond precision, or later.

Time filters examples

A) Cart events from February 23rd to July 20th, 2021

query {
eventsByEntity(entity: Cart, from: "2021-02-23", to: "2021-07-20") {
type
entity
entityID
requestID
createdAt
value
}
}

B) CartItemChanged events from February 25th to February 28th, 2021

query {
eventsByType(type: CartItemChanged, from: "2021-02-25", to: "2021-02-28") {
type
entity
entityID
requestID
createdAt
value
}
}

Known limitations

  • Subscriptions don't work for the events API yet
  • You can only query events, but not write them through this API. Use a command for that.
  • Currently, only available on the AWS provider.

Filter & Pagination

Filtering a read model

The Booster GraphQL API provides support for filtering Read Models on queries and subscriptions.

Using the GraphQL API endpoint you can retrieve the schema of your application, so you can see which filters are available for every read model and its properties. You can filter like this:

Searching for a specific Read Model by id

query {
ProductReadModels(filter: { id: { eq: "test-id" } }) {
id
sku
availability
price
}
}

Supported filters

The currently supported filters are the following ones:

Boolean filters

Filter   Value        Description
eq       true/false   Equal to
ne       true/false   Not equal to

Example:

query {
ProductReadModels(filter: { availability: { eq: true } }) {
id
sku
availability
price
}
}

Number filters

Filter   Value     Description
eq       Float     Equal to
ne       Float     Not equal to
gt       Float     Greater than
gte      Float     Greater than or equal to
lt       Float     Lower than
lte      Float     Lower than or equal to
in       [Float]   Exists in given array

Example:

query {
ProductReadModels(filter: { price: { gt: 200 } }) {
id
sku
availability
price
}
}

String filters

Filter       Value      Description
eq           String     Equal to
ne           String     Not equal to
gt           String     Greater than
gte          String     Greater than or equal to
lt           String     Lower than
lte          String     Lower than or equal to
in           [String]   Exists in given array
beginsWith   String     Starts with a given substring
contains     String     Contains a given substring
regex*       String     Matches a regular expression
iRegex*      String     Matches a case-insensitive regular expression

note

regex and iRegex are supported by the Azure and Local providers only.

Example:

query {
ProductReadModels(filter: { sku: { beginsWith: "jewelry" } }) {
id
sku
availability
price
}
}
note

eq and ne are valid filters for checking if a field value is null or not null.

Array filters

Filter     Value    Description
includes   Object   Includes a given object

Example:

query {
CartReadModels(filter: { itemsIds: { includes: "test-item" } }) {
id
price
itemsIds
}
}
note

Right now, with complex properties in arrays, you can only filter by an element if you know its exact value; it is not possible to filter by a property of the element. As a workaround, you can keep an array with the IDs of the complex property and filter by that property, as in the example above. A rough way of keeping such an array in sync is sketched below.
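
A sketch of that workaround, assuming a Cart entity with an items array of CartItem objects that have a sku field (entity, item type and import paths are illustrative, not part of the framework):

import { ReadModel, Projects } from '@boostercloud/framework-core'
import { UUID, ProjectionResult } from '@boostercloud/framework-types'
import { Cart } from '../entities/cart'
import { CartItem } from '../common/cart-item'

@ReadModel({ authorize: 'all' })
export class CartReadModel {
  public constructor(
    readonly id: UUID,
    readonly items: Array<CartItem>,
    // Plain array of scalar values, so clients can filter with `itemsIds: { includes: "..." }`
    readonly itemsIds: Array<string>
  ) {}

  @Projects(Cart, 'id')
  public static projectCart(entity: Cart): ProjectionResult<CartReadModel> {
    // Keep the plain ID array in sync with the complex items array on every projection
    return new CartReadModel(entity.id, entity.items, entity.items.map((item) => item.sku))
  }
}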

Filter combinators

All the filters can be combined to create a more complex search on the same properties of the ReadModel.

Filter   Value           Description
and      [Filters]       AND: all the filters in the list match
or       [Filters]       OR: at least one filter in the list matches
not      Filter/and/or   The element does not match the filter

Example:

query {
CartReadModels(filter: { or: [{ id: { contains: "a" } }, { id: { contains: "b" } }] }) {
id
price
itemsIds
}
}

IsDefined operator

Filter      Value        Description
isDefined   true/false   The field exists or not

Example:

query {
CartReadModels(filter: { price: { isDefined: true } }) {
id
price
itemsIds
}
}

Getting and filtering read models data at code level

Booster allows you to get your read model data in your command handlers and event handlers using the Booster.readModel method.

For example, you can filter and get the total number of products that meet your criteria in your commands like this:

@Command({
authorize: 'all',
})
export class GetProductsCount {
public constructor(readonly filters: Record<string, any>) {}

public static async handle(): Promise<{ count: number }> {
const searcher = Booster.readModel(ProductReadModel)

searcher.filter({
sku: { contains: 'toy' },
or: [
{
description: { contains: 'fancy' },
},
{
description: { contains: 'great' },
},
],
})

const result = await searcher.search()
return { count: result.length }
}
}

Warning: Notice that read models are eventually consistent objects that are recalculated as the events of all the entities affecting them are settled. You should not assume that a read model is a proper source of truth, so don't use this feature for data validation. If you need to query the most up-to-date state, consider fetching your entities with Booster.entity instead of read models, as sketched below.
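
As a rough sketch of that alternative, inside an async command handler you could fetch the current entity state directly (assuming your project defines a Cart entity; the import path is illustrative):

import { Booster } from '@boostercloud/framework-core'
import { Cart } from '../entities/cart'

// Entities are rebuilt from the stored events, so this reflects every event registered so far
const currentCart = await Booster.entity(Cart, 'demo')
if (currentCart) {
  // Validate against the up-to-date state here before registering new events
}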

Using sorting

The Booster GraphQL API also allows you to sort your read model data using the sortBy parameter of the List queries.

For example, you can sort the cart read models by the first name of their shipping address like this:

{
ListCartReadModels(filter: {}, limit: 5, sortBy: {
shippingAddress: {
firstName: ASC
}
}) {
items {
id
cartItems
checks
shippingAddress {
firstName
}
payment {
cartId
}
cartItemsIds
}
cursor
}
}

This is a preview feature available only for some Providers and with some limitations:

  • Azure:
    • Sort by one field supported.
    • Nested fields supported.
    • Sort by more than one field: unsupported.
  • Local:
    • Sort by one field supported.
    • Nested fields supported.
    • Sort by more than one field: unsupported.

Warning: It is not possible to sort by fields defined as an interface, only by classes or primitive types.

Using pagination

The Booster GraphQL API includes a type for your read models named List{"your-read-model-name"}, which is the official way to work with pagination. Alternatively, there is another type without the List prefix, which will be deprecated in future versions.

The Read Model List type includes some new parameters that can be used on queries:

  • limit; an integer that specifies the maximum number of items to be returned.

  • afterCursor; a parameter to set the cursor property returned by the previous query, if not null.

    Example:

query {
ListProductReadModels
(
limit: 1,
afterCursor: { id: "last-page-item"}
) {
id
sku
availability
price
}
}

Besides these parameters, queries on this type return a {your-read-model-name}Connection type, which includes the following properties (see the sketch after this list):

  • cursor; if there are more results to paginate, it will return the object to pass to the afterCursor parameter on the next query. If there aren't more items to be shown, it will be undefined.
  • items; the list of items returned by the query, if there aren't any, it will be an empty list.
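
A rough TypeScript view of that shape (the name ReadModelConnection is illustrative; in the GraphQL schema the type is named after each read model, e.g. ProductReadModelConnection). The exact contents of the cursor are provider-specific, so treat it as an opaque value that you pass back as afterCursor:

// Sketch of the connection shape described above, for any read model TReadModel
interface ReadModelConnection<TReadModel> {
  items: TReadModel[] // empty array when the query returns no results
  cursor?: unknown // pass this back as `afterCursor` to get the next page; undefined on the last page
}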

Using Apollo Client

One of the best clients to connect to a GraphQL API is the Apollo client. There will probably be a version for your client technology of choice. These are the main ones:

  • Apollo Client (JavaScript/TypeScript, with bindings for React and other view libraries)
  • Apollo iOS (Swift)
  • Apollo Kotlin, formerly Apollo Android (Kotlin, for Android and the JVM)

We recommend referring to the documentation of those clients to know how to use them. Here is an example of how to fully instantiate the Javascript client so that it works for queries, mutations and subscriptions:

import { split, HttpLink } from '@apollo/client'
import { getMainDefinition } from '@apollo/client/utilities'
import { WebSocketLink } from '@apollo/client/link/ws'
import { ApolloClient, InMemoryCache } from '@apollo/client'
import { SubscriptionClient } from 'subscriptions-transport-ws'

// Helper function that checks if a GraphQL operation is a subscription or not
function isSubscriptionOperation({ query }) {
const definition = getMainDefinition(query)
return definition.kind === 'OperationDefinition' && definition.operation === 'subscription'
}

// Create an HTTP link for sending queries and mutations
const httpLink = new HttpLink({
uri: '<graphqlURL>',
})

// Create a SubscriptionClient and a WebSocket link for sending subscriptions
const subscriptionClient = new SubscriptionClient('<websocketURL>', {
reconnect: true,
})
const wsLink = new WebSocketLink(subscriptionClient)

// Combine both links so that depending on the operation, it uses one or another
const splitLink = split(isSubscriptionOperation, wsLink, httpLink)

// Finally, create the client using the link created above
const client = new ApolloClient({
link: splitLink,
cache: new InMemoryCache(),
})

Now, we can send queries, mutations and subscriptions using the client instance:

import gql from 'graphql-tag'

// Query the CartReadModel
const readModelData = await client.query({
variables: {
cartID: 'demo',
},
query: gql`
query QueryCart($cartID: ID!) {
CartReadModel(id: $cartID) {
id
items
}
}
`,
})

// Send a command (mutation)
const commandResult = await client.mutate({
variables: {
cartID: 'demo',
sku: 'ABC_02',
},
mutation: gql`
mutation AddOneItemToCart($cartID: ID!, $sku: String!) {
ChangeCart(input: { cartId: $cartID, sku: $sku, quantity: 1 })
}
`,
})

// Subscribe to changes in the CartReadModel
const subscriptionOperation = client.subscribe({
variables: {
cartID: 'demo',
},
query: gql`
subscription SubscribeToCart($cartID: ID!) {
CartReadModel(id: $cartID) {
id
cartItems
}
}
`,
})

subscriptionOperation.subscribe({
next: (cartReadModel) => {
// This function is called every time the CartReadModel with ID="demo" is changed
// Parameter "cartReadModel" contains the latest version of the cart
},
})

Authorizing operations

When you have a command or read model whose access is authorized to users with a specific set of roles (see Authentication and Authorization), you need to use an authorization token to send queries, mutations or subscriptions to that command or read model.

You can use the Authentication Rocket to authorize operations; see its documentation and, more specifically, the Sign in section to learn how to get a token. Once you have a token, the way to send it varies depending on the protocol you are using to send GraphQL operations:

  • For HTTP, you need to send the HTTP header Authorization with the token, making sure you prefix it with Bearer (the kind of token Booster uses). For example:
Authorization: Bearer <your token>
  • For WebSocket connections (used for subscriptions), you need to send the token in the payload of the connection_init message, under the Authorization key. For example:
{ "type": "connection_init", "payload": { "Authorization": "<your token>" } }

You normally won't be sending tokens in such a low-level way. GraphQL clients have easier ways to send these tokens. See Sending tokens with Apollo clients.

Sending tokens with Apollo clients

We recommend checking the documentation of the specific Apollo client you are using to learn how to send tokens. However, the basics remain the same. Here is an example of how you would configure the Javascript/Typescript Apollo client to send the authorization token. The example is the same as the one shown in the Using Apollo Client section, with the changes needed to send the token. Notice that <idToken> is obtained from the Authentication Rocket.

import { split, HttpLink, ApolloLink } from '@apollo/client'
import { getMainDefinition } from '@apollo/client/utilities'
import { WebSocketLink } from '@apollo/client/link/ws'
import { ApolloClient, InMemoryCache } from '@apollo/client'
import { SubscriptionClient } from 'subscriptions-transport-ws'

function isSubscriptionOperation({ query }) {
const definition = getMainDefinition(query)
return definition.kind === 'OperationDefinition' && definition.operation === 'subscription'
}

// The HTTP link pointing to the GraphQL endpoint stays the same as in the previous example
const httpLink = new HttpLink({
uri: '<graphqlURL>',
})

// CHANGED: We create an "authLink" that modifies the operation by adding the token to the headers
const authLink = new ApolloLink((operation, forward) => {
operation.setContext({
headers: {
Authorization: 'Bearer <idToken>',
},
})
return forward(operation)
})

// <-- CHANGED: Concatenate the links so that the "httpLink" receives the operation with the headers set by the "authLink"
const httpLinkWithAuth = authLink.concat(httpLink)

const subscriptionClient = new SubscriptionClient('<websocketURL>', {
reconnect: true,
// CHANGED: added a "connectionParams" property with a function that returns the `Authorization` header containing our token
connectionParams: () => {
return {
Authorization: 'Bearer <idToken>',
}
},
})
const wsLink = new WebSocketLink(subscriptionClient)

const splitLink = split(isSubscriptionOperation, wsLink, httpLinkWithAuth) // Note that we now are using "httpLinkWithAuth"

const client = new ApolloClient({
link: splitLink,
cache: new InMemoryCache(),
})

Refreshing tokens with Apollo clients

Authorization tokens expire after a certain amount of time. When a token is expired, you will get an error and you will need to call the refresh token endpoint to get a new token. After you have done so, you need to use the new token in your GraphQL operations.

There are several ways to do this. Here we show the simplest one for learning purposes.

First, we modify the example shown in the section Sending tokens with Apollo clients so that the token is stored in a global variable and the Apollo links get the token from it. That variable will be updated when the user signs in and when the token is refreshed:

import { split, HttpLink, ApolloLink } from '@apollo/client'
import { getMainDefinition } from '@apollo/client/utilities'
import { WebSocketLink } from '@apollo/client/link/ws'
import { ApolloClient, InMemoryCache } from '@apollo/client'
import { SubscriptionClient } from 'subscriptions-transport-ws'

let authToken = undefined // <-- CHANGED: This variable will hold the token and will be updated every time the token is refreshed

function isSubscriptionOperation({ query }) {
const definition = getMainDefinition(query)
return definition.kind === 'OperationDefinition' && definition.operation === 'subscription'
}

const httpLink = new HttpLink({
uri: '<graphqlURL>',
})

const authLink = new ApolloLink((operation, forward) => {
if (authToken) {
operation.setContext({
headers: {
Authorization: `Bearer ${authToken}`, // <-- CHANGED: We use the "authToken" global variable
},
})
}
return forward(operation)
})

const httpLinkWithAuth = authLink.concat(httpLink)

const subscriptionClient = new SubscriptionClient('<websocketURL>', {
reconnect: true,
// CHANGED: added a "connectionParams" property with a function that returns the `Authorization` header containing our token
connectionParams: () => {
if (authToken) {
return {
Authorization: `Bearer ${authToken}`, // <-- CHANGED: We use the "authToken" global variable
}
}
return {}
},
})
const wsLink = new WebSocketLink(subscriptionClient)

const splitLink = split(isSubscriptionOperation, wsLink, httpLinkWithAuth)

const client = new ApolloClient({
link: splitLink,
cache: new InMemoryCache(),
})

Now, when the user signs in or when the token is refreshed, we need to do two things:

  1. Update the global variable authToken with the new token.
  2. Reconnect the socket used by the subscription client by doing subscriptionClient.close(false).

You might be wondering why we need the second step. The reason is that, with operations sent through HTTP, the token goes along with every operation, in the headers. However, with operations sent through WebSockets, like subscriptions, the token is only sent when the socket connection is established. For this reason, every time we update the token we need to reconnect the SubscriptionClient so that it sends the token again (the updated one this time). A sketch of both steps is shown below.
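
Putting those two steps together, a sign-in or refresh callback could look roughly like this. This is only a sketch: how you obtain newToken (a sign-in or refresh-token call to the Authentication Rocket) is up to your application, and authToken and subscriptionClient are the ones defined in the example above:

// Call this after signing in and after every token refresh
function onTokenUpdated(newToken: string): void {
  // 1. Update the global variable used by the Apollo links and connectionParams
  authToken = newToken
  // 2. Force the SubscriptionClient to reconnect so the new token is sent on connection_init
  subscriptionClient.close(false)
}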

The GraphQL over WebSocket protocol

Sockets are channels for two-way communication that don't follow the request-response cycle characteristic of the HTTP protocol. One part can send many messages, and the other part can receive all of them but answer only some specific ones. What is more, messages can come in any order. For example, one part can send two messages and receive the response to the second message before the response to the first one.

For these reasons, in order to have effective non-trivial communication through sockets, a sub-protocol is needed. It is in charge of making both parts understand each other, sharing authentication tokens, matching responses to the corresponding requests, etc.

The Booster WebSocket communication uses the "GraphQL over WebSocket" protocol as its subprotocol. It is in charge of all the low-level details needed to properly send subscription operations to read models and receive the corresponding data.

You don't need to know anything about this to develop with Booster, neither on the backend side nor on the frontend side (all the Apollo GraphQL clients use this protocol), but it is good to know it is there to guarantee proper communication. In case you are really curious, you can read about the protocol here.

note

The WebSocket communication in Booster only supports this subprotocol, whose identifier is graphql-ws. For this reason, when you connect to the WebSocket provisioned by Booster, you must specify the graphql-ws subprotocol. If not, the connection won't succeed.
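
For example, if you were opening the socket yourself with the standard WebSocket API instead of a GraphQL client, the subprotocol would go in the second argument. A minimal sketch, where <websocketURL> is the WebSocket endpoint of your deployed Booster application (in Node you would need a WebSocket implementation such as the ws package):

// The 'graphql-ws' subprotocol is mandatory when connecting to the Booster WebSocket
const socket = new WebSocket('<websocketURL>', 'graphql-ws')

socket.onopen = () => {
  // First message of the GraphQL over WebSocket protocol
  socket.send(JSON.stringify({ type: 'connection_init' }))
}

socket.onmessage = (event) => {
  console.log('Message from the server:', event.data)
}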

+ \ No newline at end of file diff --git a/index.html b/index.html index 46c9152cc..65b9e49d3 100644 --- a/index.html +++ b/index.html @@ -6,13 +6,13 @@ Ask about Booster Framework | Booster Framework - +

Ask about Booster Framework

PrivateGPT · Free beta version
+ \ No newline at end of file diff --git a/introduction/index.html b/introduction/index.html index ce30c7a6a..05c690308 100644 --- a/introduction/index.html +++ b/introduction/index.html @@ -6,13 +6,13 @@ Introduction | Booster Framework - +

Introduction

Progress isn't made by early risers. It's made by lazy men trying to find easier ways to do something. (Robert A. Heinlein)

What is Booster?

Booster is the fastest way to create an application in the cloud. It is a new kind of framework to build scalable and reliable systems easier, reimagining the software development experience to maximize your team’s speed and reduce friction on every level.

Booster follows an Event-Driven and a Domain-Driven Design approach in which you define your application in terms that are understandable by anyone in your company. From a bird’s eye view, your project is organized into:

  • Commands: Define what a user can request from the system (i.e: Add an item to the cart)
  • Queries: Define what a user can get from the system (i.e: Get cart items)
  • Events: Simple records of facts (i.e: User X added item Y to the cart Z)
  • Entities: Data about the things that the people in your company talk about (i.e: Orders, Customers, etc.)
  • Handlers: Code that processes commands, reacts to events to trigger other actions, or updates entities after new events happen.

Events are the cornerstone of a Booster application, and that’s why we say that Booster is an event-driven framework. Events bring us many of the differentiating characteristics of Booster:

  • Real-time: Events can trigger other actions when they’re created, and updates can be pushed to connected clients without extra requests.
  • High data resiliency: Events are stored by default in an append-only database, so the data is never lost and it’s possible to recover any previous state of the system.
  • Scalable by nature: Dependencies only happen at the data level, so Booster apps can ingest more data without waiting for other operations to complete. Low coupling also makes it easier to evolve the code without affecting other parts of the system.
  • Asynchronous: Your users won't need to wait for your system to process the whole operation before continuing using it.

Before Booster, building an event-driven system with these characteristics required huge investments in hiring engineers with the needed expertise. Booster packs this expertise, acquired from real-world scenarios in high-scale companies, into a very simple tool that handles the hard parts for you, even provisioning the infrastructure!

We have redesigned the whole developer experience from scratch, taking advantage of the advanced TypeScript type system and Serverless technologies to go from project generation to a production-ready application in the cloud which provides a real-time GraphQL API that can ingest thousands of concurrent users in a matter of minutes.

Booster's ultimate goal is making developers' lives easier, fulfilling the dream of writing code in a domain-driven way that eases communication for the whole team, without caring about how anything else is done at the infrastructure level!

Booster Principles

Booster enhances developers' productivity by focusing only on business logic. Write your code, provide your credentials and let Booster do the rest. Booster takes a holistic and highly-opinionated approach at many levels:

  • Focus on business value: The only code that makes sense is the code that makes your application different from any other.
  • Convention over configuration: All the supporting code and configuration that is similar in all applications should be out of programmers’ sight.
  • Truly Serverless: Why go Serverless to avoid managing infrastructure when you can implicitly infer your Serverless architecture from your code and not even deal with that?
  • Effective Multicloud: Booster design makes it possible to run the same application in AWS or Azure with no code changes in your application.
  • Scale smoothly: The code you write to handle your first 100 users will still work to handle your first million. You won't need to rewrite your application when it succeeds.
  • Event-sourcing and CQRS: Our world is event-driven, businesses are event-driven, and modern software maps better to reality when it’s event-driven.
  • Principle of Abstraction: Building an application is hard enough to have to deal with recurring low-level details like SQL, API design, or authentication mechanisms, so we tend to build more semantic abstractions on top of them.
  • Real-time first: Client applications must be able to react to events happening in the backend and notice data changes.
  • Extensible: Rockets are what we call plugins in the Booster ecosystem. A rocket is a regular node package that works out of the box and provides new end-to-end abstractions, supports new cloud services, or pre-built functionalities that you can install in your project.

Why use Booster

What does Booster boost? Your team’s productivity. Not just because it helps you work faster, but because it makes you worry about fewer buttons and switches. We aim to solve major productivity sinks for developers like designing the right cloud infrastructure, writing APIs or dealing with ORMs.

Booster will fit like a glove in applications that are naturally event-driven like commerce applications (retail, e-commerce, omnichannel applications, warehouse management, etc.), business applications or communication systems, but it's a general-purpose framework that has several advantages over other solutions:

  • Faster time-to-market: Booster can deploy your application to a production-ready environment from minute one, without complicated configurations or needing to invest any effort to design it. In addition to that, it features a set of code generators to help developers build the project scaffolding faster and focus on actual business code in a matter of seconds instead of dealing with complicated framework folklore.
  • Write less code: Booster conventions and abstractions require less code to implement the same features. This not only speeds up development but combined with clear architecture guidelines also makes Booster projects easier to understand, iterate, and maintain.
  • Benefit from Typescript's advantages: Typescript's type system provides an important security layer that helps developers make sure the code they write is the code they meant to write, making Booster apps more reliable and less error-prone.
  • All the advantages of Microservices, none of its cons: Microservices are a great way to deal with code complexity, at least on paper. Services are isolated and can scale independently, and different teams can work independently, but that usually comes with a con: interfaces between services introduce huge challenges like delays, hard to solve cyclic dependencies, or deployment errors. In Booster, every handler function works as an independent microservice, it scales separately in its own lambda function, and there are no direct dependencies between them, all communication happens asynchronously via events, and all the infrastructure is compiled, type-checked and deployed atomically to avoid issues.
  • All the advantages of Serverless, without needing a degree in cloud technologies: Serverless technologies are amazing and have made a project like Booster possible, but they're relatively new technologies, and while day after day new tools appear to make them easier, the learning curve is still quite steep. With Booster you'll take advantage of Serverless’ main selling points of high scalability and reduced hosting costs, without having to learn every detail from minute one.
  • Event-sourcing by default: Booster keeps all incremental data changes as events, indefinitely. This means that any previous state of the system can be recreated and replayed at any moment, enabling a whole world of possibilities for troubleshooting and auditing, syncing environments or performing tests and simulations.
  • Booster makes it easy to build enterprise-grade applications: Implementing an event-sourcing system from scratch is a challenging exercise that usually requires highly specialized experts. There are some technical challenges like eventual consistency, message ordering, and snapshot building. Booster takes care of all of that and more for you, lowering the curve for people that are starting and making expert lives easier.
  • Choose your application cloud and avoid vendor lock-in: Booster provides a highly decoupled architecture that enables the possibility of integrating with ease new providers with different specifications, including a custom Multi-cloud provider, without affecting the framework specification.
+ \ No newline at end of file diff --git a/search/index.html b/search/index.html index add7e043b..9793ed3d6 100644 --- a/search/index.html +++ b/search/index.html @@ -6,13 +6,13 @@ Search the documentation | Booster Framework - +

Search the documentation

- + \ No newline at end of file diff --git a/security/authentication/index.html b/security/authentication/index.html index 68b09220c..ad9332076 100644 --- a/security/authentication/index.html +++ b/security/authentication/index.html @@ -6,13 +6,13 @@ Authentication | Booster Framework - +
-

Authentication

Booster uses the OAuth 2.0 protocol to authenticate users. That means that it uses tokens to identify users and authorize them. These tokens are called access tokens and are issued by an authentication provider. The most common authentication provider is Auth0, but you can use any other provider that supports OAuth 2.0.

Configuring the authentication provider

The first step to configure authentication in Booster is to configure the authentication provider. The provider must support OAuth 2.0 and must be able to issue access tokens. In order to validate incoming tokens and make sure that user requests come from trusted origins, you need to provide one or more TokenVerifier instances at config time for each of your environments.

The TokenVerifier class is a simple interface that you can implement to define your own token verifiers. Booster provides a JwksUriTokenVerifier class that you can use to configure a JWT token verifier. The JwksUriTokenVerifier constructor accepts the following parameters:

  • issuer: The issuer of the tokens. This is a mandatory parameter. This is commonly found in the token payload under the iss key.
  • jwksUri: The URL of the JSON Web Key Set (JWKS) that contains the public keys used to verify the tokens. This is a mandatory parameter. You can find more information about JWKS here.
  • rolesClaim: The name of the claim that contains the user roles. This is an optional parameter. If not provided, the roles claim will be used. This is commonly found in the token payload under the roles key.

Here is an example of how to configure a JwksUriTokenVerifier:

src/config/config.ts
import { Booster, JwksUriTokenVerifier } from '@boostercloud/framework-core'
import { BoosterConfig } from '@boostercloud/framework-types'

Booster.configure('production', (config: BoosterConfig): void => {
config.appName = 'app-name'
config.providerPackage = '@boostercloud/framework-provider-x'
config.tokenVerifiers = [
new JwksUriTokenVerifier(
'https://my-auth0-tenant.auth0.com/', // Issuer
'https://my-auth0-tenant.auth0.com/.well-known/jwks.json', // JWKS URL
'role' // Roles claim
),
]
})
JWK Verifier

One common way to validate JWT tokens is by using an issuer-provided well-known URI on which you can find their JSON Web Key Sets (JWKS). If you use this method, you only need to provide the issuer's name, the JWKS URI and, if you're using role-based authentication, an optional rolesClaim option that sets the claim from which Booster will read the role names.

JWKS URI glossary

Here you can find a list of the most common authentication providers and their corresponding issuer and JWKS URI:

caution

The issuer and JWKS URI may change depending on the region you're using. Please check the provider's documentation to find the correct values for your use case.

The following list is not exhaustive and the information may become outdated. If you want to add a new provider or update an existing one, please open a PR to keep this content up to date.

  • Auth0
      Issuer: https://<your-tenant>.auth0.com/
      JWKS URI: https://<your-tenant>.auth0.com/.well-known/jwks.json
  • AWS Cognito
      Issuer: https://cognito-idp.<region>.amazonaws.com/<user-pool-id>
      JWKS URI: https://cognito-idp.<region>.amazonaws.com/<user-pool-id>/.well-known/jwks.json
  • Okta
      Issuer: https://<your-tenant>.okta.com/oauth2/default
      JWKS URI: https://<your-tenant>.okta.com/oauth2/default/v1/keys
  • Google
      Issuer: https://accounts.google.com
      JWKS URI: https://www.googleapis.com/oauth2/v3/certs
  • Firebase
      Issuer: https://accounts.google.com
      JWKS URI: https://www.googleapis.com/oauth2/v3/certs
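
For example, plugging the Google entry of this list into the verifier shown earlier would look like the following sketch (the app name, provider package and roles claim are placeholders):

import { Booster, JwksUriTokenVerifier } from '@boostercloud/framework-core'
import { BoosterConfig } from '@boostercloud/framework-types'

Booster.configure('production', (config: BoosterConfig): void => {
  config.appName = 'app-name'
  config.providerPackage = '@boostercloud/framework-provider-x'
  config.tokenVerifiers = [
    new JwksUriTokenVerifier(
      'https://accounts.google.com', // Issuer, taken from the list above
      'https://www.googleapis.com/oauth2/v3/certs', // JWKS URI, taken from the list above
      'roles' // Roles claim (optional, placeholder name)
    ),
  ]
})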

Public key based authentication

The PublicKeyTokenVerifier class uses a public key that you provide to verify the token signature, so it doesn't rely on the issuer exposing a JWKS URI. Verifying tokens against a JWKS URI, as described above, is the most common approach, but it's not the only one; if neither fits your needs, you can implement your own TokenVerifier class.

This is useful when the token issuer doesn't provide a JWKS URI, when you're implementing your own authentication mechanism, or when you're issuing self-signed tokens.

src/config/config.ts
import { Booster, PublicKeyTokenVerifier } from '@boostercloud/framework-core'
import { BoosterConfig } from '@boostercloud/framework-types'

function publicKeyResolver(): Promise<string> {
  // Your implementation here
}

Booster.configure('production', (config: BoosterConfig): void => {
  config.appName = 'app-name'
  config.providerPackage = '@boostercloud/framework-provider-x'
  config.tokenVerifiers = [
    new PublicKeyTokenVerifier(
      'issuer-name', // Issuer name
      publicKeyResolver(), // Promise that resolves to the public key string
      'custom:roles' // Name of the claim to read the roles from (if you're using role-based authorization)
    ),
  ]
})
info

Notice that publicKeyResolver returns a promise that resolves to a string, so it can also be used to load the public key from a remote location (e.g. fetching it from your KMS).

tip

If you need to handle private keys in production, consider using a KMS (Key Management System). These systems often provide API endpoints that let you encrypt/sign your JWT tokens without exposing the private keys. The public keys can be set in a PublicKeyTokenVerifier to automate verification.
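
As an illustration, a minimal publicKeyResolver could read a PEM-encoded public key from disk; the file path below is an assumption, and in production you would typically replace the body with a call to your KMS or secrets manager:

import { readFile } from 'fs/promises'

// Minimal sketch: load the PEM-encoded public key from disk (the path is an assumption).
// Swap the body for a call to your KMS or secrets manager in production.
async function publicKeyResolver(): Promise<string> {
  return readFile('./keys/public.pem', 'utf8')
}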

Custom authentication

If you want to implement your own authentication mechanism, you can implement your own TokenVerifier class. This class must implement the following interface:

interface TokenVerifier {
  /**
   * Verify and deserialize a stringified token with this token verifier.
   * @param token The token to verify
   */
  verify(token: string): Promise<DecodedToken>

  /**
   * Build a valid `UserEnvelope` from a decoded token.
   * @param decodedToken The decoded token
   */
  toUserEnvelope(decodedToken: DecodedToken): UserEnvelope
}

Here is an example of how to implement a custom TokenVerifier:

src/config/config.ts
import { Booster } from '@boostercloud/framework-core'
import { BoosterConfig, DecodedToken, TokenVerifier, UserEnvelope } from '@boostercloud/framework-types'

class CustomTokenVerifier implements TokenVerifier {
  public async verify(token: string): Promise<DecodedToken> {
    // Your custom token verification logic here
  }

  public toUserEnvelope(decodedToken: DecodedToken): UserEnvelope {
    // Your custom logic to build a UserEnvelope from a decoded token here
  }
}

Booster.configure('production', (config: BoosterConfig): void => {
  config.appName = 'app-name'
  config.providerPackage = '@boostercloud/framework-provider-x'
  config.tokenVerifiers = [new CustomTokenVerifier()]
})

Some use cases for this could be to check that the token was generated specifically for your service by inspecting the aud claim, or to check that the token has not been blacklisted or invalidated by your business logic (e.g. a user logs out before the token's expiration date and the token is added to an invalidated-tokens list, so that an attacker who finds it later can't use it to impersonate the legitimate owner).
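
As a rough sketch of the aud check mentioned above (using the jsonwebtoken package as an assumed dependency; the expected audience value and helper name are also assumptions), you could call something like this from your custom verify method alongside the signature checks:

import * as jwt from 'jsonwebtoken'

const EXPECTED_AUDIENCE = 'https://api.my-app.example.com' // assumption: the audience your issuer sets for this service

function assertAudience(token: string): void {
  // Decode without verifying the signature; signature verification happens elsewhere in your verifier
  const payload = jwt.decode(token) as jwt.JwtPayload | null
  const aud = payload?.aud
  const audiences = Array.isArray(aud) ? aud : [aud]
  if (!audiences.includes(EXPECTED_AUDIENCE)) {
    throw new Error('Token was not issued for this service') // throwing here rejects the request
  }
}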

Extend existing token verifiers

If you only need to perform extra validations on top of one of the existing TokenVerifiers, you can extend one of the default implementations:

export class CustomValidator extends PrivateKeyValidator {
  public async verify(token: string): Promise<UserEnvelope> {
    // Call to the PrivateKeyValidator verify method to check the signature
    const userEnvelope = await super.verify(token)

    // Do my extra validations here. Throwing an error will reject the token
    await myExtraValidations(userEnvelope.claims, token)

    return userEnvelope
  }
}

Advanced authentication

If you need to do more advanced checks, for example if you're using non-standard or legacy tokens, you can implement the whole verification algorithm yourself. For convenience, Booster exposes many of the utility functions that it uses in the default TokenVerifier implementations:

  • getJwksClient: Initializes a jwksRSA client that can be used to get the public key of a JWKS URI using the getKeyWithClient function.
  • getKeyWithClient: Initializes a function that can be used to get the public key from a JWKS URI with the signature required by the verifyJWT function. You can create a client using the getJwksClient function.
  • verifyJWT: Verifies a JWT token using a key or key resolver function and returns a Booster UserEnvelope.
/**
* Initializes a jwksRSA client that can be used to get the public key of a JWKS URI using the
* `getKeyWithClient` function.
*/
export function getJwksClient(jwksUri: string) {
...
}

/**
* Initializes a function that can be used to get the public key from a JWKS URI with the signature
* required by the `verifyJWT` function. You can create a client using the `getJwksClient` function.
*/
export function getKeyWithClient(
client: jwksRSA.JwksClient,
header: jwt.JwtHeader,
callback: jwt.SigningKeyCallback
): void {
...
}

/**
* Verifies a JWT token using a key or key resolver function and returns a Booster UserEnvelope.
*/
export async function verifyJWT(
token: string,
issuer: string,
key: jwt.Secret | jwt.GetPublicKeyOrSecret,
rolesClaim?: string
): Promise<UserEnvelope> {
...
}
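
As a rough sketch of how these utilities compose (the import path, issuer, JWKS URI and roles claim are assumptions based on the JWKS flow described earlier), a verifier for legacy tokens could resolve the signing key from a JWKS URI and delegate the actual verification to verifyJWT:

import * as jwt from 'jsonwebtoken'
import { getJwksClient, getKeyWithClient, verifyJWT } from '@boostercloud/framework-core' // assumed export location
import { UserEnvelope } from '@boostercloud/framework-types'

const client = getJwksClient('https://my-auth0-tenant.auth0.com/.well-known/jwks.json')

// Key resolver with the callback signature expected by verifyJWT
const getKey: jwt.GetPublicKeyOrSecret = (header, callback) => getKeyWithClient(client, header, callback)

export async function verifyLegacyToken(token: string): Promise<UserEnvelope> {
  // Delegates signature and issuer validation to the framework utility
  return verifyJWT(token, 'https://my-auth0-tenant.auth0.com/', getKey, 'custom:roles')
}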
+ \ No newline at end of file diff --git a/security/authorization/index.html b/security/authorization/index.html index b3b7f0eed..7b89d2203 100644 --- a/security/authorization/index.html +++ b/security/authorization/index.html @@ -6,7 +6,7 @@ Authorization | Booster Framework - + @@ -17,8 +17,8 @@ sign-up, sign-in, passwordless tokens, change password and many other features. When a user goes through the sign up and sign in mechanisms provided by the rocket, they'll get a standard JWT access token that can be included in any request as a Bearer token and will work in the same way as any other JWT token.

When you use this rocket, you can use extra configuration parameters in the @Role decorator to enable some of these features. In the following example we define Admin, User, SuperUser and SuperUserWithoutConfirmation roles. They all contain an extra auth configuration attribute that sets the authorization behavior for each role:

@Role({
  auth: {
    signUpMethods: [], // Using an empty array here prevents sign-ups (Admin has no special treatment. If you don't enable signup, you'll need to create the first admin manually in the AWS console)
  },
})
export class Admin {}

@Role({
  auth: {
    signUpMethods: ['email'], // Enable email sign-ups for Users
  },
})
export class User {}

@Role({
  auth: {
    signUpMethods: ['email', 'phone'], // Can sign up by email or phone
    skipConfirmation: false, // Requires email or phone confirmation. The rocket will send either an email or an SMS with a confirmation link.
  },
})
export class SuperUser {}

@Role({
  auth: {
    signUpMethods: ['email', 'phone'],
    skipConfirmation: true, // Doesn't require email or phone confirmation
  },
})
export class SuperUserWithoutConfirmation {}

To learn more about the Authorization rocket for AWS, please read the README in its Github repository.

Custom authorization functions

Booster also allows you to implement your own authorization functions, in case the role-based authorization model doesn't work for your application. In order to apply your own authorization functions, you need to provide them in the authorize field of the command or read model. As authorization functions are regular JavaScript functions, you can easily reuse them in your project or even in other Booster projects as a library.

Command Authorizers

As mentioned, the authorize parameter of the @Command decorator can receive a function, but that function must match the CommandAuthorizer type. It receives two parameters and returns a Promise that resolves if the user is authorized to execute the command and rejects if not:

export type CommandAuthorizer = (currentUser?: UserEnvelope, input?: CommandInput) => Promise<void>
  • currentUser (UserEnvelope): User data decoded from the provided token
  • input (CommandInput): The input of the command

For instance, if you want to restrict a command to users that have a permission named Permission-To-Rock in the permissions claim you can do this:


const CustomCommandAuthorizer: CommandAuthorizer = async (currentUser) => {
  if (!currentUser.claims['permissions'].includes('Permission-To-Rock')) {
    throw new Error(`User ${currentUser.username} should not be rocking!`) // <- This will reject the access to the command
  }
}

@Command({
  authorize: CustomCommandAuthorizer,
})
export class PerformIncredibleGuitarSolo {
  ...
}

Read Model Authorizers

As with commands, the authorize parameter of the @ReadModel decorator can also receive a function, which must match the ReadModelAuthorizer type. It receives two parameters and returns a Promise that resolves if the user is authorized to query the read model and rejects if not:

export type ReadModelAuthorizer<TReadModel extends ReadModelInterface> = (
currentUser?: UserEnvelope,
readModelRequestEnvelope?: ReadModelRequestEnvelope<TReadModel>
) => Promise<void>
  • currentUser (UserEnvelope): User data decoded from the provided token
  • readModelRequestEnvelope (ReadModelRequestEnvelope<TReadModel>): The read model request, including the read model class name and the requested keys or filters

For instance, you may want to restrict access to a specific resource only to users who have been granted read permission:

const CustomReadModelAuthorizer: ReadModelAuthorizer = async (currentUser, readModelRequestEnvelope) => {
  const userPermissions = await Booster.entity(UserPermissions, currentUser.username)
  if (!userPermissions || !userPermissions.accessTo[readModelRequestEnvelope.className].includes(readModelRequestEnvelope.key.id)) {
    throw new Error(`User ${currentUser.username} should not be looking here`)
  }
}

@ReadModel({
authorize: CustomReadModelAuthorizer
})
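
For completeness, here is how the authorizer above could be attached to a full read model class; the class name and fields below are purely illustrative:

@ReadModel({
  authorize: CustomReadModelAuthorizer,
})
export class SensitiveDocumentReadModel {
  public constructor(readonly id: UUID, readonly title: string, readonly ownerId: string) {}

  // ...projections from your entities go here
}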

Event Stream Authorizers

You can restrict access to the Event Stream of an Entity by providing an authorizeReadEvents function in the @Entity decorator. This function is called every time an event stream is requested. It must match the EventStreamAuthorizer type, which receives the current user and the event search request as parameters and returns a Promise<void>. If the promise is rejected, the request will be denied; if it resolves successfully, the request will be allowed.

export type EventStreamAuthorizer = (
currentUser?: UserEnvelope,
eventSearchRequest?: EventSearchRequest
) => Promise<void>

For instance, you can restrict access to entities that the current user owns:

const CustomEventAuthorizer: EventStreamAuthorizer = async (currentUser, eventSearchRequest) => {
  const { entityID } = eventSearchRequest.parameters
  if (!entityID) {
    throw new Error(`${currentUser.username} cannot list carts`)
  }

  const cart = await Booster.entity(Cart, entityID)
  if (cart.ownerUserName !== currentUser.username) {
    throw new Error(`${currentUser.username} cannot see events in cart ${entityID}`)
  }
}


@Entity({
  authorizeReadEvents: CustomEventAuthorizer,
})
export class Cart {
  public constructor(
    readonly id: UUID,
    readonly ownerUserName: string,
    readonly cartItems: Array<CartItem>,
    public shippingAddress?: Address,
    public checks = 0
  ) {}
  ...
}
+ \ No newline at end of file diff --git a/security/security/index.html b/security/security/index.html index c245143c6..8c6e411fa 100644 --- a/security/security/index.html +++ b/security/security/index.html @@ -6,7 +6,7 @@ Security | Booster Framework - + @@ -14,8 +14,8 @@ - +authorizer functions.

+ \ No newline at end of file