
Commit a53eed3

Build Docker Container for the .NET Sample (#33)
* renaming the local param to make it more readable
* typo and add build step
* await task
* change to nuget reference for docker build
* copy the package file to the local nuget folder
* change the build with the new namespace
* add the local nuget into source control
* add the deployment step
* forgot to push the readme
* change the csproj
* add docker file
* Update .dockerignore
* Update dotnet-azurefunction.csproj
* Update DaprExtensionTests.csproj
1 parent 1416104 commit a53eed3

14 files changed: +384 −20 lines

Dockerfile

Lines changed: 19 additions & 0 deletions
@@ -0,0 +1,19 @@

FROM mcr.microsoft.com/dotnet/core/sdk:3.0 AS installer-env

# Copy the DaprExtension, style cop, and dotnet sample into the installer-env to build
COPY /src/DaprExtension /src/src/DaprExtension
COPY /.stylecop /src/.stylecop
COPY /samples/dotnet-azurefunction /src/samples/dotnet-function-app

# Build project
RUN cd /src/samples/dotnet-function-app && \
    mkdir -p /home/site/wwwroot && \
    dotnet publish *.csproj --output /home/site/wwwroot

# To enable ssh & remote debugging on app service change the base image to the one below
# FROM mcr.microsoft.com/azure-functions/dotnet:3.0-appservice
FROM mcr.microsoft.com/azure-functions/dotnet:3.0
ENV AzureWebJobsScriptRoot=/home/site/wwwroot \
    AzureFunctionsJobHost__Logging__Console__IsEnabled=true

COPY --from=installer-env ["/home/site/wwwroot", "/home/site/wwwroot"]
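
The `AzureFunctionsJobHost__Logging__Console__IsEnabled` setting above relies on the .NET configuration convention that a double underscore in an environment variable name maps to a nested configuration section. As a rough illustration of that mapping (the `env_to_nested` helper below is hypothetical, not part of the Functions host):

```python
# Sketch: how .NET-style double-underscore environment variables map to
# nested configuration sections (helper name is illustrative only).
def env_to_nested(env):
    """Convert {'A__B__C': 'v'} into {'A': {'B': {'C': 'v'}}}."""
    result = {}
    for key, value in env.items():
        node = result
        parts = key.split("__")
        for part in parts[:-1]:
            node = node.setdefault(part, {})
        node[parts[-1]] = value
    return result

settings = env_to_nested({
    "AzureFunctionsJobHost__Logging__Console__IsEnabled": "true",
})
print(settings)
# {'AzureFunctionsJobHost': {'Logging': {'Console': {'IsEnabled': 'true'}}}}
```

So the ENV line effectively enables console logging under the `AzureFunctionsJobHost:Logging:Console` configuration section.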
.dockerignore

Lines changed: 5 additions & 1 deletion
@@ -1 +1,5 @@
-local.settings.json
+local.settings.json
+deploy/
+README.md
+.gitignore
+.dockerignore

samples/dotnet-azurefunction/README.md

Lines changed: 252 additions & 3 deletions
@@ -119,7 +119,7 @@ POST http://localhost:3501/v1.0/invoke/functionapp/method/CreateNewOrder

**Note**: in this sample, the `DaprServiceInvocationTrigger` attribute does not specify the method name, so it defaults to the FunctionName. Alternatively, you can use `[DaprServiceInvocationTrigger(MethodName = "newOrder")]` to specify the service invocation method name that your function should respond to. In that case, use the following command:

```powershell
-dapr invoke --app-id nodeapp --method newOrder --payload "{\"data\": { \"orderId\": \"41\" } }"
+dapr invoke --app-id functionapp --method newOrder --payload "{\"data\": { \"orderId\": \"41\" } }"
```

In your terminal window, you should see logs indicating that the message was received and state was updated:
@@ -274,5 +274,254 @@ To stop your services from running, simply stop the "dapr run" process. Alternat

```bash
 dapr stop --app-id functionapp
-dapr stop --app-id nodeapp
```

# Build the Docker Container for Your Function App

Now that you have your Dapr'd function app running locally, you probably want to deploy it to a Kubernetes cluster. If you have updated the sample code to fit your scenario, you need to build new images containing your updated code. First, install Docker on your machine. Then follow these steps to build a custom container image for your function:

1. Update the function app as you see fit.
2. There are two ways to build the Docker image, depending on whether the project file references `Dapr.AzureFunctions.Extension` as a **project reference** (as this .NET sample does) or as a **NuGet reference**.

### Approach 1: Using a Project Reference

1a. Go to the root directory of this repo; you should see a `dockerfile` under the `/azure-functions-extension` folder.

2a. Continue with step 3.

### Approach 2: Using a NuGet Reference

1b. Navigate to the `/dotnet-azurefunction` directory. You should see the default `Dockerfile` provided by Azure Functions, which specifies the suitable custom container base image for the selected runtime. Check [here](https://hub.docker.com/_/microsoft-azure-functions-base) for more information on supported base images.

2b. Change the csproj file to use the NuGet package. It will resolve the Dapr extension package reference from the local NuGet source, which points to the `localnuget` folder. See the definition in the `nuget.config` file.

3b. Copy the latest `.nupkg` file from `$RepoRoot/bin/Debug/nugets` or `$RepoRoot/bin/Release/nugets` into the `/dotnet-azurefunction/localNuget` folder.
3. Run the docker build command and specify your image name:

```
docker build -t my-docker-id .
```

If you're planning on hosting it on Docker Hub, it should be:

```
docker build -t my-docker-id/mydocker-image .
```

4. Once your image has been built, you can see it on your machine by running `docker images`. Try running the image in a local container to test the build. Use the `-e` option to specify the app settings. Open a browser to http://localhost:8080, which should show that your function app is up and running with `;-)`. You can skip the storage connection when testing this, but you may then see exceptions in your container log complaining that storage is not defined.

```
docker run -e AzureWebJobsStorage='connection-string' -e StateStoreName=statestore -e KafkaBindingName=sample-topic -p 8080:80 my-docker-id/mydocker-image
```

5. To publish your Docker image to Docker Hub (or another registry), first log in with `docker login`. Then run `docker push my-docker-id/mydocker-image`.
6. Update your .yaml file to reflect the new image name.
7. Deploy your updated Dapr-enabled app: `kubectl apply -f <YOUR APP NAME>.yaml`.

# Deploy Dapr'd Function App into Kubernetes

Next, we will walk through the steps to get your Dapr'd function app running in a Kubernetes cluster.

## Prerequisites

Since our sample covers multiple Dapr components, the list of requirements is long. Skip any step that is not required for your own function app.

- Install [kubectl](https://kubernetes.io/docs/tasks/tools/install-kubectl/)
- Install [helm](https://helm.sh/docs/intro/install/) (you can skip this if your function app does not use Kafka bindings)
- A Kubernetes cluster, such as [Minikube](https://github.com/dapr/docs/blob/master/getting-started/cluster/setup-minikube.md), [AKS](https://github.com/dapr/docs/blob/master/getting-started/cluster/setup-aks.md) or [GKE](https://cloud.google.com/kubernetes-engine/)
- An [Azure Storage Account](https://docs.microsoft.com/en-us/azure/storage/common/storage-account-create?tabs=azure-portal) to host your function app
  - Follow this guide to [find out the connection string](https://docs.microsoft.com/en-us/azure/storage/common/storage-configure-connection-string#configure-a-connection-string-for-an-azure-storage-account).
- A state store, such as a [Redis Store](https://github.com/dapr/docs/blob/master/howto/configure-redis/README.md), for the Dapr state store and pub/sub message delivery (you can skip this if your function does not use the aforementioned components)

## Setup Dapr on your Kubernetes Cluster
333+
Once you have a cluster, run `dapr init --kubernetes` to deploy Dapr to it. Please follow this( guide on [how to install Dapr on your kubrtnetes](https://github.com/dapr/docs/blob/master/getting-started/environment-setup.md#installing-dapr-on-a-kubernetes-cluster) via Dapr CLI or Helm. Dapr CLI does not support non-default namespaces and only is recommended for testing purposes.
334+
If you need a non-default namespace or in production environment, Helm has to be used.
335+
336+
```
337+
⌛ Making the jump to hyperspace...
338+
✅ Deploying the Dapr Operator to your cluster...
339+
✅ Success! Dapr has been installed. To verify, run 'kubectl get pods -w' in your terminal
340+
```
341+
## Deploy your Dapr Building Blocks

#### [Optional] Configure the State Store

- Replace the hostname and password in `deploy/redis.yaml`. See https://github.com/dapr/samples/tree/master/2.hello-kubernetes#step-2---create-and-configure-a-state-store for details.
- Run `kubectl apply -f ./deploy/redis.yaml` and observe that your state store was successfully configured:

```
component.dapr.io/statestore configured
```

- Follow the [secret management](https://github.com/dapr/docs/tree/master/concepts/secrets) instructions to securely manage your secrets in a production-grade application.
- More detail can be found in the Dapr sample repo under [2.hello-kubernetes](https://github.com/dapr/samples/tree/master/2.hello-kubernetes#step-2---create-and-configure-a-state-store).

#### [Optional] Setting up Kafka in Kubernetes

- Install Kafka via the incubator/kafka helm chart:

```
helm repo add incubator http://storage.googleapis.com/kubernetes-charts-incubator
helm repo update
kubectl create ns kafka
helm install dapr-kafka incubator/kafka --namespace kafka -f ./kafka-non-persistence.yaml
```

- Run `kubectl -n kafka get pods -w` to watch the Kafka pods come up. This might take a few minutes, but eventually you should see:

```
NAME                     READY   STATUS    RESTARTS   AGE
dapr-kafka-0             1/1     Running   0          2m7s
dapr-kafka-zookeeper-0   1/1     Running   0          2m57s
dapr-kafka-zookeeper-1   1/1     Running   0          2m13s
dapr-kafka-zookeeper-2   1/1     Running   0          109s
```
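
If you would rather script this readiness check than eyeball the table, here is a small sketch (assuming the default `kubectl get pods` column layout; `all_pods_ready` is a hypothetical helper, not a kubectl feature):

```python
# Sketch: check that every pod in `kubectl get pods` output is Running and
# fully ready. Assumes the default whitespace-separated column layout.
def all_pods_ready(kubectl_output):
    for line in kubectl_output.strip().splitlines()[1:]:  # skip the header row
        name, ready, status = line.split()[:3]
        current, desired = ready.split("/")
        if status != "Running" or current != desired:
            return False
    return True

sample = """\
NAME                     READY   STATUS    RESTARTS   AGE
dapr-kafka-0             1/1     Running   0          2m7s
dapr-kafka-zookeeper-0   1/1     Running   0          2m57s
dapr-kafka-zookeeper-1   1/1     Running   0          2m13s
dapr-kafka-zookeeper-2   1/1     Running   0          109s
"""
print(all_pods_ready(sample))  # True
```

You could feed it the output of `kubectl -n kafka get pods` from a subprocess call if you want to poll until the cluster is ready.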

- Run `kubectl apply -f ./deploy/kafka.yaml` and observe that your Kafka binding was successfully configured:

```
component.dapr.io/sample-topic created
```

- Follow the [secret management](https://github.com/dapr/docs/tree/master/concepts/secrets) instructions to securely manage your secrets in a production-grade application.

#### [Optional] Setting up Pub/Sub in Kubernetes

- In this demo, we use Redis Streams (Redis version 5 and above) to enable pub/sub. Replace the hostname and password in `deploy/redis-pubsub.yaml`. See https://github.com/dapr/samples/tree/master/2.hello-kubernetes#step-2---create-and-configure-a-state-store for details.
- Run `kubectl apply -f ./deploy/redis-pubsub.yaml` and observe that your pub/sub component was successfully configured:

```
component.dapr.io/messagebus configured
```

- See the Dapr sample repo [4.pub-sub](https://github.com/dapr/samples/tree/master/4.pub-sub) for more instructions.

Now you should have all the Dapr components up and running in your Kubernetes cluster. Next, we will show how to deploy your function app into the cluster alongside the Dapr sidecar.

## Deploy your Dapr'd Function App

You can find your function app deployment file at `deploy/function.yaml`. Let's take a look:

```yaml
kind: Secret
apiVersion: v1
metadata:
  name: functionapp
  namespace: default
data:
  AzureWebJobsStorage: Base64EncodedConnectionString
  StateStoreName: c3RhdGVzdG9yZQ==
  KafkaBindingName: c2FtcGxlLXRvcGlj
```

- Put your app settings into the `data` block. Note that each value has to be Base64 encoded. For example, the `StateStoreName` value is configured as `statestore` in `deploy/redis.yaml`; the string `statestore` encodes to `c3RhdGVzdG9yZQ==`.
- The connection string you retrieved should be formatted as `DefaultEndpointsProtocol=https;AccountName=storagesample;AccountKey=<account-key>`, which would be encoded into `RGVmYXVsdEVuZHBvaW50c1Byb3RvY29sPWh0dHBzO0FjY291bnROYW1lPXN0b3JhZ2VzYW1wbGU7QWNjb3VudEtleT08YWNjb3VudC1rZXk+`.
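
To produce these Base64 values yourself, any base64 tool will do; for example, a quick Python sketch (`encode_setting` is just an illustrative helper):

```python
import base64

def encode_setting(value):
    """Base64-encode an app setting value for a Kubernetes Secret."""
    return base64.b64encode(value.encode("utf-8")).decode("ascii")

print(encode_setting("statestore"))    # c3RhdGVzdG9yZQ==
print(encode_setting("sample-topic"))  # c2FtcGxlLXRvcGlj
```

The same result can be had on the command line with `echo -n statestore | base64` (note `-n`: a trailing newline would change the encoding).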

In the second part of the deployment file, you need to fill in your image name and specify the app port that your Dapr triggers will listen on.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: functionapp
  labels:
    app: functionapp
spec:
  replicas: 1
  selector:
    matchLabels:
      app: functionapp
  template:
    metadata:
      labels:
        app: functionapp
      annotations:
        dapr.io/enabled: "true"
        dapr.io/id: "functionapp"
        dapr.io/port: "<app-port>"
    spec:
      containers:
      - name: functionapp
        image: <your-docker-hub-id>/<your-image-name>
        ports:
        - containerPort: <app-port>
        imagePullPolicy: Always
        envFrom:
        - secretRef:
            name: functionapp
```
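
Note that the app port appears in two places that must agree: the `dapr.io/port` annotation and the container's `containerPort`. A rough sanity-check sketch (using simple regexes rather than a real YAML parser; `ports_match` is a hypothetical helper):

```python
import re

def ports_match(manifest):
    """Check that the dapr.io/port annotation equals the containerPort value."""
    dapr_port = re.search(r'dapr\.io/port:\s*"([^"]+)"', manifest).group(1)
    container_port = re.search(r'containerPort:\s*(\S+)', manifest).group(1)
    return dapr_port == container_port

manifest = '''
      annotations:
        dapr.io/port: "3001"
    spec:
      containers:
      - ports:
        - containerPort: 3001
'''
print(ports_match(manifest))  # True
```

If the two values drift apart, the Dapr sidecar will forward requests to a port your app is not listening on.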

Now run the following command to deploy the function app into your Kubernetes cluster:

``` powershell
$ kubectl apply -f ./deploy/functionapp.yaml

secret/functionapp created
deployment.apps/functionapp created
```

Run `kubectl get pods` to see that your function app is up and running:

```
NAME                                     READY   STATUS    RESTARTS   AGE
dapr-operator-64b94c8b85-jtbpn           1/1     Running   0          10m
dapr-placement-844cf4c696-2mv88          1/1     Running   0          10m
dapr-sentry-7c8fff7759-zwph2             1/1     Running   0          10m
dapr-sidecar-injector-675df889d5-22wxr   1/1     Running   0          10m
functionapp-6d4cc6b7f7-2p9n9             2/2     Running   0          8s
```


## Test your Dapr'd Function App

Now let's try invoking our functions. You can use the following command to view the logs. Use `--tail` to specify the last `n` lines of logs.

```powershell
kubectl logs --selector=app=functionapp -c functionapp --tail=50
```

In order to hit your function app endpoint, you can use port forwarding. Use the pod name of your function app:

```
kubectl port-forward functionapp-6d4cc6b7f7-2p9n9 {port-of-your-choice}:3001
```

Now, similar to what we did when testing locally, use your preferred tool to send HTTP requests. Here we use the REST Client plugin.

``` http
POST http://localhost:{port-of-your-choice}/CreateNewOrder

{
    "data": {
        "orderId": 41
    }
}
```

``` http
POST http://localhost:{port-of-your-choice}/RetrieveOrder
```

``` http
POST http://localhost:{port-of-your-choice}/SendMessageToKafka

{"message": "hello!" }
```
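
If you prefer scripting to a REST client, the same requests can be built with Python's standard library. This is only a sketch: `8080` stands in for your chosen port-forward port, and `build_request` is an illustrative helper.

```python
import json
import urllib.request

def build_request(url, body=None):
    """Build a POST request like the ones above (JSON body is optional)."""
    data = json.dumps(body).encode("utf-8") if body is not None else None
    headers = {"Content-Type": "application/json"} if body is not None else {}
    return urllib.request.Request(url, data=data, headers=headers, method="POST")

req = build_request(
    "http://localhost:8080/CreateNewOrder",
    {"data": {"orderId": 41}},
)
# To actually send it (requires the port-forward to be active):
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)
print(req.get_method(), req.full_url)  # POST http://localhost:8080/CreateNewOrder
```

The `RetrieveOrder` and `SendMessageToKafka` requests can be built the same way by changing the URL and body.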

Run the `kubectl logs` command to retrieve the latest logs. You should see your function app being invoked, just as you saw when testing locally:

``` powershell
: Function.RetrieveOrder[0]
      Executing 'RetrieveOrder' (Reason='', Id=0f378098-d15a-4f13-81ea-20caee7ae10c)
: Function.RetrieveOrder.User[0]
      C# function processed a RetrieveOrder request from the Dapr Runtime.
: Function.RetrieveOrder.User[0]
      {"orderId":41}
: Function.RetrieveOrder[0]
      Executed 'RetrieveOrder' (Succeeded, Id=0f378098-d15a-4f13-81ea-20caee7ae10c)

: Function.CreateNewOrder[0]
      Executing 'CreateNewOrder' (Reason='', Id=faa53523-85c3-41cb-808c-02d47cb7dcdc)
: Function.CreateNewOrder.User[0]
      C# function processed a CreateNewOrder request from the Dapr Runtime.
: Function.CreateNewOrder[0]
      Executed 'CreateNewOrder' (Succeeded, Id=faa53523-85c3-41cb-808c-02d47cb7dcdc)

: Function.SendMessageToKafka.User[0]
      C# HTTP trigger function processed a request.
: Function.SendMessageToKafka[0]
      Executed 'SendMessageToKafka' (Succeeded, Id=5aa8e383-9c8b-4686-90a7-089d71118d81)

: Function.ConsumeMessageFromKafka[0]
      Executing 'ConsumeMessageFromKafka' (Reason='', Id=aa8d92a6-2da1-44ff-a033-cb217b9c29541)
: Function.ConsumeMessageFromKafka.User[0]
      Hello from Kafka!
: Function.ConsumeMessageFromKafka[0]
      Trigger {data: {"message": "hello!"}
: Function.SendMessageToKafka[0]
      Executed 'ConsumeMessageFromKafka' (Succeeded, Id=aa8d92a6-2da1-44ff-a033-cb217b9c29541)
```

## Cleanup

Once you're done with the sample, you can spin down your Kubernetes resources by navigating to the `./deploy` directory and running:

```
kubectl delete -f .
```

This will remove each resource defined by the .yaml files in the deploy directory.

samples/dotnet-azurefunction/SendMessageToKafka.cs

Lines changed: 2 additions & 1 deletion
@@ -9,11 +9,12 @@ namespace dotnet_azurefunction
     using Microsoft.Extensions.Logging;
     using Dapr.AzureFunctions.Extension;
     using Newtonsoft.Json.Linq;
+    using System.Threading.Tasks;

     public static class SendMessageToKafka
     {
         [FunctionName("SendMessageToKafka")]
-        public static async void Run(
+        public static async Task Run(
             [DaprServiceInvocationTrigger] JObject payload,
             [DaprBinding(BindingName = "%KafkaBindingName%", Operation = "create")] IAsyncCollector<object> messages,
             ILogger log)
Lines changed: 40 additions & 0 deletions
@@ -0,0 +1,40 @@

kind: Secret
apiVersion: v1
metadata:
  name: functionapp
  namespace: default
data:
  AzureWebJobsStorage: <encoded-connection-string>
  StateStoreName: <encoded-state-store-name>
  KafkaBindingName: <encoded-kafka-binding-name>

---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: functionapp
  labels:
    app: functionapp
spec:
  replicas: 1
  selector:
    matchLabels:
      app: functionapp
  template:
    metadata:
      labels:
        app: functionapp
      annotations:
        dapr.io/enabled: "true"
        dapr.io/id: "functionapp"
        dapr.io/port: "3001"
    spec:
      containers:
      - name: functionapp
        image: <your-docker-hub-id>/<your-image-name>
        ports:
        - containerPort: 3001
        imagePullPolicy: Always
        envFrom:
        - secretRef:
            name: functionapp
Lines changed: 21 additions & 0 deletions
@@ -0,0 +1,21 @@

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: sample-topic
  namespace: default
spec:
  type: bindings.kafka
  metadata:
  # Kafka broker connection setting
  - name: brokers
    value: dapr-kafka.kafka:9092
  # consumer configuration: topic and consumer group
  - name: topics
    value: sample
  - name: consumerGroup
    value: group1
  # publisher configuration: topic
  - name: publishTopic
    value: sample
  - name: authRequired
    value: "false"
Lines changed: 12 additions & 0 deletions
@@ -0,0 +1,12 @@

apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: messagebus
  namespace: default
spec:
  type: pubsub.redis
  metadata:
  - name: redisHost
    value: <your-redis-host-name>
  - name: redisPassword
    value: <your-redis-password>
