Implementing a reporting bridge
Learn how to implement a custom reporting bridge.
Introduction
This tutorial shows you how to create your very own reporting bridge that implements the Ledger reporting protocol. If you are not familiar with the protocol, check out About Reports.
We will be using the prebuilt bridge template in order to make the integration process faster and smoother.
As always, when building a production-ready service, follow your usual best practices for application design and security.
The code is written using TypeScript, but you can also use JavaScript.
Quick Start
Prerequisites
We will need the following tools in order to develop and run the demo.
This tutorial is part of the Reporting Learning Path, so you should be familiar with the rest of the content there before proceeding. The following steps assume that you already have a Ledger set up according to How to set up an RTP Ledger.
Ledger instance
We will need a cloud ledger instance to work with.
Node.js and npm
https://nodejs.org/en/download/
Minka CLI tools
https://www.npmjs.com/package/@minka/cli
After installing Node.js and npm, you can install Minka CLI tools by running the following command.
$ npm install -g @minka/cli
Docker
https://docs.docker.com/get-docker/
Creating a project
The quickest way to start working on a new integration is to set up a new project using the Minka CLI:
$ minka bridge init reporting
? Handle: reporting
? Signer: owner
? Signer password for owner [hidden]
Target directory: reporting-bridge-tutorial
This command is going to initialize a new bridge project in current working directory.
If the target directory is not empty some files may be overwritten.
The signer you selected is used to sign requests that bridge sends to ledger.
? Do you want to proceed? Yes
✔ Downloading bridge code... Done
✔ Extracting bridge files... Done
✔ Configuring environment... Done
✔ Installing dependencies... Done
✅ Bridge project created successfully.
A new NodeJS project has been created which implements an API interface
compatible with ledger. Learn more on how to start customizing this solution
to your needs by reading the included README.md file.
To run the created bridge type the following:
> minka bridge start
We have now set up a new bridge project, which is a great starting point for building integrations between payment networks. We will use just the event processing capabilities of the service we created to implement the report creation protocol. If you would like to find out how to use the same starting project to process intents, check out Building a bridge service.
The service we have created defines all the API endpoints required to connect to a remote ledger, and it includes a mock banking core implementation.
The bridge code is open source, and you can modify any part of it to adapt it to your needs.
Bridge project structure
The Bridge SDK automatically handles scheduling, communication with ledger, data persistence, idempotency, auditability, and retries for us. We only need to implement adapters that perform the required operations in our banking core systems.
Most of the complexity related to communication with ledger is handled by the @minka/bridge-sdk and @minka/ledger-sdk libraries, which are provided and maintained by the ledger core team. These libraries are already installed and configured in our project.
The project file structure is shown below:
|- src
| |- adapters
| | |- credit.adapter.ts
| | |- debit.adapter.ts
| | `- README.md
| |
| |- core-sdk
| | |- account.ts
| | |- errors.ts
| | |- index.ts
| | |- ledger.ts
| | |- README.md
| | `- transaction.ts
| |
| |- .env
| |- config.ts
| `- main.ts
|
|- .gitignore
|- package-lock.json
|- package.json
|- README.md
`- tsconfig.json
The project is very simple because it only contains code that is custom to our integration; everything reusable and generic is already included in the libraries provided by the ledger team.
Bridge adapters
The intent processing part of the bridge is located in the adapters directory, which we will not need in this tutorial. This directory contains custom adapters that map two-phase commit operations to banking systems. We could delete these, but we will leave them in, since the same bridge can also be used to handle intent processing.
Adding README.md files is a general pattern the project follows, keeping the most common answers immediately available alongside the code.
Core SDK
The core-sdk directory contains a mock implementation of an in-memory banking core. This code is added to the project by default to better show how to connect to a banking core system.
You can safely remove it and replace it with SDKs that connect to your own banking systems; it is included for demonstration purposes only.
Config
The .env file is generated automatically by the CLI when setting up the project. It uses default values for the DB connection, the signers provided to the CLI when creating the project, etc.
The config.ts file loads and validates the .env file. Please review this config file and update the values as needed. Protect private keys and DB passwords using the best practices you usually follow when deploying services within your organization.
The .env file contains sensitive data like keys and passwords and should never be committed to version control.
main.ts is the entry point of our bridge service. This file bootstraps the whole service and starts it. Use it to register additional adapters and routes, and to modify any other setup values you may need to change.
The service consists of two main components which are bootstrapped in main.ts: a server and a processor.
The server is an Express app which exposes REST APIs. You can configure the server like this:
const server = ServerBuilder.init()
.useDataSource({ ...dataSource, migrate: true })
.useLedger(ledger)
.build()
const options: ServerOptions = {
port: config.PORT,
routePrefix: 'v2',
}
await server.start(options)
You can learn more about the bridge architecture and configuration in the README.md file of the project.
Running a local bridge
We can run the service we have created by going into the newly created directory and running the following command:
$ minka bridge start
Local bridge configuration detected:
- handle: reporting
- server: https://ldg-stg.one/api/v2
- ledger: bqrb-example
- registerWithLedger: true
? Run this configuration? Yes
Cloud ledger detected, creating a tunnel...
Created ✔
Bridge is available through following URLs:
- Server URL: http://localhost:3100/v2
- Public URL: https://heavy-friends-chew.loca.lt/v2
Registering reporting with ledger...
Bridge record found, updating it...
Registered ✔
Checking dependencies:
✔ NodeJs 20.17.0
✔ Docker Running
✔ Starting dependencies... Done
✔ Starting main service... Done
> bridge@1.0.0 start
> nodemon ./src/main.ts
[nodemon] 3.1.7
[nodemon] to restart at any time, enter `rs`
[nodemon] watching path(s): *.*
[nodemon] watching extensions: ts,json
[nodemon] starting `ts-node ./src/main.ts`
Bridge is running on port 3100.
Waiting for new requests...
The CLI starts a local server and registers it with ledger automatically by creating or updating a bridge record.
Leave the bridge running and continue the tutorial in a new terminal.
Ledger setup
Now we have everything running and can try to create our first report. Before creating reports, though, we still need to set a few things up. First of all, we will create a schema for our new report type using the command below. When prompted to enter the schema, use your default editor and input the JSON schema content shown below.
$ minka schema create
? Handle: test-report
? Record: report
? Add custom data? No
? What would you like to use to enter the schema: My default editor
? Enter schema content: Received
? Signer: owner
? Signer password for owner [hidden]
✅ Schema created successfully:
Schema summary:
---------------------------------------------------------------------------
Handle: test-report
Record: report
Format: json-schema
Schema content:
---------------------------------------------------------------------------
{
"description": "Create Test report",
"properties": {
"custom": {
"additionalProperties": false,
"properties": {
"account": {
"title": "Account",
"type": "string"
}
},
"required": [
"account"
],
"title": "",
"type": "object"
}
},
"required": [
"custom"
],
"title": "Test report",
"type": "object"
}
Luid: $sch.-147wbaXgwsP2mH_X
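Ledger will enforce this schema when reports are created, but if you want to sanity-check payloads locally you can compile the same JSON Schema with a standard validator. Here is a small sketch using Ajv, which is not part of the generated project (run npm install ajv first):
import Ajv from 'ajv'
// The schema content we registered with ledger above.
const testReportSchema = {
  title: 'Test report',
  type: 'object',
  required: ['custom'],
  properties: {
    custom: {
      type: 'object',
      additionalProperties: false,
      required: ['account'],
      properties: {
        account: { title: 'Account', type: 'string' },
      },
    },
  },
}
const ajv = new Ajv()
const validate = ajv.compile(testReportSchema)
// Custom data shaped like the report we will create below.
console.log(validate({ custom: { account: '1001001001' } })) // true
console.log(validate({ custom: { account: 1001001001 } })) // false, account must be a string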
Now we can create our report record, but our new bridge will still not get called to create the actual report. We first need to set up our ledger to send report-created events to our reporting bridge. To do this, we register an effect that handles the report-created signal:
$ minka effect create
? Handle: test-report-created
? Signal: report-created
? Add filter? Yes
? Field class: string
? Field title: report.data.schema
? Field value: test-report
? What do you want to do next? Finish adding fields
? Action schema: bridge
? Bridge handle: reporting
? Add custom data? No
? Signer: owner
? Signer password for owner [hidden]
✅ Effect created successfully:
Effect summary:
---------------------------------------------------------------------------
Handle: test-report-created
Signal: report-created
Filter:
- report.data.schema: test-report
Action:
- schema: bridge
- bridge: reporting
Access rules:
#0
- Action: any
- Signer:
- public: aeblUjUYsPlYsWzWfajZIExO2Dz5NVbX2RRDpiiTwrc=
Status: created
Luid: $eff.-148J5LbNwrx089dk
Handle: owner
Public: aeblUjUYsPlYsWzWfajZIExO2Dz5NVbX2RRDpiiTwrc=
Check that your ledger does not have any other reporting bridges active, like BQRB, because they may also start processing report-created events. If you do, you should either disable that bridge, remove its reports trait, or edit its effect filter to exclude the test-report schema.
Report creation
Event handler implementation
Now we can create a report by running:
$ minka report create
? Handle: qFyj1c7swknqt17ez8pLm
? Schema: test-report
? Add custom data? Yes
? Field class: string
? Field title: account
? Field value: 1001001001
? What do you want to do next? Finish adding fields
? Signer: owner
? Signer password for owner [hidden]
✅ Report created successfully:
Report summary:
---------------------------------------------------------------------------
Handle: qFyj1c7swknqt17ez8pLm
Schema: test-report
Custom:
- account: 1001001001
Status: created
Access rules:
#0
- Action: any
- Signer:
- public: aeblUjUYsPlYsWzWfajZIExO2Dz5NVbX2RRDpiiTwrc=
Handle: owner
Public: aeblUjUYsPlYsWzWfajZIExO2Dz5NVbX2RRDpiiTwrc=
This will result in the bridge service receiving an event with the report record.
...
Error: Effect not registered
at EffectsController.execute (effects.controller.ts:37:13)
at ActionMetadata.callMethod (ActionMetadata.ts:252:44)
at RoutingControllers.ts:123:28
at processTicksAndRejections (node:internal/process/task_queues:95:5)
[22:26:32.786] ERROR: [] No matching bindings found for serviceIdentifier: EventHandler
EventHandler - named: test-report-created
You can see that the service is letting us know that we don't have a handler registered for the received event, so that is what we will add now.
We will make the following changes to the bridge service:
Create src/handlers/test-report-created.handler.ts:
import { inject, injectable } from "inversify";
import { BRIDGE_KEY_PAIR, EventContext, EventHandler } from "@minka/bridge-sdk";
import { LedgerSdk } from "@minka/ledger-sdk";
import {
BaseLedgerEvent, EventSignal,
LedgerKeyPair,
LedgerReport
} from "@minka/ledger-sdk/types";
@injectable()
export class TestReportCreatedHandler extends EventHandler {
constructor(
protected sdk: LedgerSdk,
@inject(BRIDGE_KEY_PAIR)
protected bridgeKeyPair: LedgerKeyPair,
) {
super()
}
async execute(
context: EventContext<BaseLedgerEvent<LedgerReport>>,
): Promise<void> {
const event = context?.event
console.log(`Received event ${event?.data?.handle}`)
console.log(JSON.stringify(event, null, 2))
}
}
Create src/handlers/index.ts:
export * from './test-report-created.handler'
Replace the bootstrapServer function in main.ts with the following:
// ...
import { TestReportCreatedHandler } from "./handlers";
// ...
// Server is exposing REST API endpoints
const bootstrapServer = async () => {
const server = ServerBuilder
.init()
.useEventHandlerClass(TestReportCreatedHandler, 'test-report-created')
.build(container)
await server.start({ port: config.SERVER_PORT })
}
// ...
Now creating a report will result in the following:
...
Received event:
{
"event": {
"data": {
"handle": "evt_pCzDHnhVbHDR7-rCW",
"report": {
"data": {
"access": [
{
"action": "any",
"signer": {
"public": "aeblUjUYsPlYsWzWfajZIExO2Dz5NVbX2RRDpiiTwrc="
}
}
],
"custom": {
"account": "1001001001"
},
"handle": "yj8bF1UFQP9ypqFN49VRX",
"schema": "test-report"
},
"hash": "624e7e2ff2328c9d44e23b5895c411195900cccc51ac45efd854d3b4e50db849",
"luid": "$rep.-14H1gpXZavamo0ak",
"meta": {
"moment": "2025-10-01T20:56:10.366Z",
"owners": [
"aeblUjUYsPlYsWzWfajZIExO2Dz5NVbX2RRDpiiTwrc="
],
"proofs": [
{
"custom": {
"moment": "2025-10-01T20:56:10.229Z",
"status": "created"
},
"digest": "4bb69517b1aa2964bfebbc88bc83e853696f09acc1504df903fc57d0aad20d4b",
"method": "ed25519-v2",
"public": "aeblUjUYsPlYsWzWfajZIExO2Dz5NVbX2RRDpiiTwrc=",
"result": "Jv7IyNFHNRcMjy+Ld5JTknm8UafXRhB1WdxXAKXdPQF0WiAByuOf13zKyJ0zsrOUyrPo6ZTDeC+ELz73/MRNAw=="
},
{
"custom": {
"luid": "$rep.-14H1gpXZavamo0ak",
"moment": "2025-10-01T20:56:10.370Z",
"status": "created"
},
"digest": "557038b507ae9dbf9601dee15ff5decebb999e3e98d7103864d56efc347a72bf",
"method": "ed25519-v2",
"public": "6QIeobmo9D2tfRl+jFQRP+eI25NR77z/ifhuOjCwABA=",
"result": "XbcSwrA3Izg91gO0SuM4sFkJQWU7XWqp1OAw9QmDjsiWSz9zPRJyghO5VH8H592nZ1VvJ5rRw5QSwyn7XC2FAg=="
}
],
"status": "created"
}
},
"signal": "report-created"
},
"hash": "cb4789a91b51de0963efe7f6eaca6b71da60c65859153425e8e2cea73fbec7f6",
"meta": {
"proofs": [
{
"custom": {
"moment": "2025-10-01T20:56:10.399Z"
},
"digest": "f3c4461a05ed4739d4a8a45d6c9ea96656fa0988ef92ccde3aaedc7e88a1307b",
"method": "ed25519-v2",
"public": "6QIeobmo9D2tfRl+jFQRP+eI25NR77z/ifhuOjCwABA=",
"result": "1kco5T50t9cJTAcHNIs/mW3SrnyEYbY1Tt8HhZcRVb4Qwph3ksC3mKk57gosFqwGv7AG9huWfLZWYDXqJ6LfCw=="
}
]
}
}
}
You can see that the entire report record is delivered to the bridge for verification purposes.
We now need to follow the reporting protocol:
- validate the received event
- validate the report record
- send a preparing proof to Ledger
- generate the report
- save the report to GCS
- send a completed proof to Ledger
- in case of errors, send a rejected proof to Ledger
Event validation
In general, the execute function in TestReportCreatedHandler would schedule a job for a processor that would generate the report, but we will not be doing that here because it is not the focus of this tutorial.
To verify the received event, we will use the ledger-sdk. It is already configured with our ledger public key, so we just need to add the following code to our handler:
// ...
async execute(
context: EventContext<BaseLedgerEvent<LedgerReport>>,
): Promise<void> {
const event = context?.event
console.log(`Received event ${event?.data?.handle}`)
console.log(JSON.stringify(event, null, 2))
await this.validateEvent(event)
}
async validateEvent(event: LedgerRecord<BaseLedgerEvent<LedgerReport>>) {
console.log(`Validating event ${event?.data?.handle}`)
// validate that the event came from ledger
const verificationClient = this.sdk.proofs.ledger()
await verificationClient.verify(event)
// validate that we got the report-created event
if (event?.data?.signal !== EventSignal.ReportCreated) {
throw new Error(`Incorrect event signal, expected ${EventSignal.ReportCreated}, got ${event?.data?.signal}`)
}
console.log(`Successfully validated event ${event?.data?.handle}`)
}
// ...
Don't forget to import the types you need from ledger-sdk.
This will validate that the event we received is signed by our ledger and that it's the correct event.
Report validation
Next we will validate the report record contained in the event. If the previous step was successful and we configured the ledger correctly, the report should follow the schema, but we want to check just in case.
We need to add the following code:
// ...
const TEST_REPORT_SCHEMA = 'test-report'
// ...
async execute(
context: EventContext<BaseLedgerEvent<LedgerReport>>,
): Promise<void> {
// ...
await this.validateEvent(event)
const report = event?.data?.report as ReportRecord
await this.validateReport(report)
}
// ...
async validateReport(report: ReportRecord) {
console.log(`Validating report ${report?.data?.handle}`)
const data = report?.data
// validate that the report has the correct schema
if (data?.schema !== TEST_REPORT_SCHEMA) {
throw new Error(`Incorrect report schema, expected ${TEST_REPORT_SCHEMA}, got ${data?.schema}`)
}
// validate that the account parameter is a string
if (typeof data?.custom?.account !== 'string') {
throw new Error(`Incorrect account parameter ${data?.custom?.account}`)
}
console.log(`Successfully validated report ${data?.handle}`)
}
// ...
Sending preparing proof
Now we will let Ledger know that we started working on the report. To do that, we need to send a preparing proof, and we will create a helper function to do that.
// ...
import { config } from "../config";
// ...
async execute(
context: EventContext<BaseLedgerEvent<LedgerReport>>,
): Promise<void> {
// ...
await this.validateReport(report)
await this.sendReportProof(report, {status: ReportStatus.Pending})
}
// ...
async sendReportProof(report: ReportRecord, proofCustom: Partial<ReportProofCustom>): Promise<ReportResponse> {
console.log(`Sending proof for report ${report.data.handle}:\n${JSON.stringify(proofCustom, null, 2)}`)
return await this.sdk.report
.from(report)
.hash()
.sign([
{
keyPair: {
format: 'ed25519-raw' as const,
public: config.BRIDGE_PUBLIC_KEY,
secret: config.BRIDGE_SECRET_KEY,
},
custom: {moment: new Date().toISOString(), ...proofCustom},
},
])
.send()
}
Report generation
Next, we will generate the actual report. In a real-world scenario we would use data exported from Ledger, but since that is not the point of this tutorial, we will use the data we have in core-sdk. The report we are creating simply exports all transactions for a given account. We are using sync functions here and storing the CSV in memory before saving it; in a real implementation you would probably want to use async functions and stream the data into the file in GCS instead (see the streaming sketch later in this tutorial).
// ...
import { stringify } from 'csv-stringify/sync'
import { coreSdk } from "../core-sdk";
// ...
async execute(
context: EventContext<BaseLedgerEvent<LedgerReport>>,
): Promise<void> {
// ...
await this.sendReportProof(report, {status: ReportStatus.Pending})
const reportData = this.generateReportCSV(report.data.custom.account)
}
// ...
generateReportCSV(account: string): string {
console.log(`Generating report for account ${account}`)
const data = coreSdk.transactions.filter(transaction => transaction.account === account)
return stringify(data, {
delimiter: ',',
header: true,
quoted: true,
quote: '"',
})
}
In order for the code above to work, we will also need to install the csv-stringify package:
$ npm install csv-stringify
added 1 package, and audited 2 packages in 460ms
found 0 vulnerabilities
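If you want to see what stringify produces before wiring it into the handler, you can run it on a couple of records shaped like the mock core's transactions. The values below are illustrative; the real records also carry error and idempotency fields, as the finished report at the end of this tutorial shows:
import { stringify } from 'csv-stringify/sync'
// Illustrative records shaped like the mock core-sdk transactions.
const rows = [
  { id: '0', type: 'CREDIT', account: '1001001001', amount: '1000', status: 'COMPLETED' },
  { id: '1', type: 'DEBIT', account: '1001001001', amount: '100', status: 'COMPLETED' },
]
console.log(stringify(rows, { delimiter: ',', header: true, quoted: true, quote: '"' }))
// "id","type","account","amount","status"
// "0","CREDIT","1001001001","1000","COMPLETED"
// "1","DEBIT","1001001001","100","COMPLETED"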
Saving file to GCS
To save the data to GCS, we will need to prepare a few things. First, let's add two more config options by adding the following to our config.ts file:
// ...
REPORTING_PROJECT: str({ desc: 'Reporting project' }),
REPORTING_BUCKET: str({ desc: 'Reporting bucket' }),
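For orientation, this is roughly where those lines end up. The sketch below assumes config.ts validates the environment with envalid, which is what the str({ desc }) helper suggests; check your generated file for the exact shape:
import { cleanEnv, str } from 'envalid'
// Sketch only: the generated config.ts already defines the other variables.
export const config = cleanEnv(process.env, {
  // ...existing variables...
  REPORTING_PROJECT: str({ desc: 'Reporting project' }),
  REPORTING_BUCKET: str({ desc: 'Reporting bucket' }),
})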
We will also add the following values to the end of our .env file:
REPORTING_PROJECT=minka-ledger-stg
REPORTING_BUCKET=ledger-reports-stg
Your actual values may differ depending on your environment. Now we will also install the GCS Node.js client:
$ npm install @google-cloud/storage
added 84 packages, and audited 85 packages in 1s
14 packages are looking for funding
run `npm fund` for details
found 0 vulnerabilities
And finally we will add the code to generate the correct filename and save the report:
// ...
import { Storage } from "@google-cloud/storage";
const FILENAME = 'report.csv'
// ...
async execute(
context: EventContext<BaseLedgerEvent<LedgerReport>>,
): Promise<void> {
// ...
const reportData = this.generateReportCSV(report.data.custom.account)
const fileLocation = this.generateReportFilename(report, FILENAME)
const cloudStorageURI = await this.saveFile(fileLocation, reportData)
}
generateReportFilename(
report: ReportRecord,
asset: string,
): string {
const domain = report.meta.domain
const schema = report.data.schema
const luid = report.luid
return (
`ledgers/${config.LEDGER_HANDLE}` +
(domain && domain.length > 0 ? `/domains/${domain}` : '') +
(schema && schema.length > 0 ? `/schemas/${schema}` : '') +
`/reports/${luid}` +
`/assets/${asset}`
)
}
async saveFile(
filename: string,
data: string
): Promise<string> {
console.log(`Saving file ${filename} to GCS`)
const storage = new Storage({projectId: config.REPORTING_PROJECT})
const bucket = storage.bucket(config.REPORTING_BUCKET)
const file = bucket.file(filename)
await file.save(data)
console.log(`Successfully saved ${filename} to GCS`)
return file.cloudStorageURI.toString()
}
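As mentioned above, buffering the whole CSV in memory does not scale to accounts with many transactions. A minimal streaming variant of saveFile could look like the sketch below, which pipes rows through the csv-stringify stream API straight into a GCS write stream; the streamReportToGCS helper is hypothetical, and in practice the rows source would be your actual data export rather than an in-memory array:
import { pipeline } from 'node:stream/promises'
import { Readable } from 'node:stream'
import { stringify } from 'csv-stringify'
import { Storage } from '@google-cloud/storage'
import { config } from '../config'
// Hypothetical helper: streams rows through the CSV encoder directly into
// the GCS object, so the full report is never held in memory.
async function streamReportToGCS(
  rows: Iterable<Record<string, unknown>>,
  filename: string,
): Promise<string> {
  const storage = new Storage({ projectId: config.REPORTING_PROJECT })
  const file = storage.bucket(config.REPORTING_BUCKET).file(filename)
  await pipeline(
    Readable.from(rows),
    stringify({ delimiter: ',', header: true, quoted: true, quote: '"' }),
    file.createWriteStream({ resumable: false }),
  )
  return file.cloudStorageURI.toString()
}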
In order for the code above to work properly, you will need a Google Cloud account with the right permissions. Locally, the storage client picks up credentials through Application Default Credentials, for example via the GOOGLE_APPLICATION_CREDENTIALS environment variable. You can ask your business representative to help you set it up.
Sending completed proof
Now we will save the location of the generated asset to Ledger by sending a completed proof:
// ...
async execute(
context: EventContext<BaseLedgerEvent<LedgerReport>>,
): Promise<void> {
// ...
const cloudStorageURI = await this.saveFile(fileLocation, reportData)
const assets = [{handle: FILENAME, output: cloudStorageURI}]
await this.sendReportProof(report, {
status: ReportStatus.Completed,
assets
})
}
Handling errors and sending rejected proof
In case of issues while processing, we will fail report creation by sending a rejected proof to Ledger. To do that, we will wrap the entire processing in a try-catch block. The execute method will finally look like this:
// ...
async execute(
context: EventContext<BaseLedgerEvent<LedgerReport>>,
): Promise<void> {
const event = context?.event
console.log(`Received event ${event?.data?.handle}`)
console.log(JSON.stringify(event, null, 2))
await this.validateEvent(event)
const report = event?.data?.report as ReportRecord
try {
await this.validateReport(report)
await this.sendReportProof(report, {status: ReportStatus.Pending})
const reportData = this.generateReportCSV(report.data.custom.account)
const fileLocation = this.generateReportFilename(report, FILENAME)
const cloudStorageURI = await this.saveFile(fileLocation, reportData)
const assets = [{handle: FILENAME, output: cloudStorageURI}]
await this.sendReportProof(report, {
status: ReportStatus.Completed,
assets
})
} catch (error) {
console.error(`Error while processing event ${event?.data?.handle}`)
await this.sendReportProof(report, {status: ReportStatus.Rejected})
}
}
Of course, it is still possible that sending the rejected proof also fails, which should be handled as well.
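How to handle that is a policy decision for your deployment. One minimal option, sketched below, is to log the failure and rethrow so the event is not silently dropped; assuming the retry support mentioned earlier redelivers failed events, the proof would then be attempted again:
// ...
} catch (error) {
  console.error(`Error while processing event ${event?.data?.handle}`)
  try {
    await this.sendReportProof(report, { status: ReportStatus.Rejected })
  } catch (proofError) {
    // Sending the rejected proof itself failed; rethrow so the
    // event handler fails visibly instead of swallowing the error.
    console.error(`Failed to send rejected proof for report ${report?.data?.handle}`)
    throw proofError
  }
}
// ...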
This is what we get now after creating a report:
Received event evt_APLE0jVTV-Fabr4Lm
{
...
}
Validating event evt_APLE0jVTV-Fabr4Lm
Successfully validated event evt_APLE0jVTV-Fabr4Lm
Validating report 2N9mpcWYT6EUu366qpCtr
Successfully validated report 2N9mpcWYT6EUu366qpCtr
Sending proof for report 2N9mpcWYT6EUu366qpCtr:
{
"status": "pending"
}
Generating report for account 1001001001
Saving file ledgers/bqrb-example/schemas/test-report/reports/$rep.-16QUFyOiq34F-s9H/assets/report.csv to GCS
Successfully saved ledgers/bqrb-example/schemas/test-report/reports/$rep.-16QUFyOiq34F-s9H/assets/report.csv to GCS
Sending proof for report 2N9mpcWYT6EUu366qpCtr:
{
"status": "completed",
"assets": [
{
"handle": "report.csv",
"output": "gs://ledger-reports-stg/ledgers/bqrb-example/schemas/test-report/reports/$rep.-16QUFyOiq34F-s9H/assets/report.csv"
}
]
}
You can now use the CLI to check out your report:
$ minka report show 2N9mpcWYT6EUu366qpCtr
Report summary:
---------------------------------------------------------------------------
Handle: 2N9mpcWYT6EUu366qpCtr
Schema: test-report
Custom:
- account: 1001001001
Status: completed
Assets:
#0
- Handle: report.csv
- Output: gs://ledger-reports-stg/ledgers/bqrb-example/schemas/test-report/reports/$rep.-16QUFyOiq34F-s9H/assets/report.csv
Access rules:
#0
- Action: any
- Signer:
- public: aeblUjUYsPlYsWzWfajZIExO2Dz5NVbX2RRDpiiTwrc=
You can see that it is completed with the report.csv asset attached. To download and view the report, run:
$ minka report download 2N9mpcWYT6EUu366qpCtr report.csv
File report.csv downloaded successfully.
$ cat report.csv
"id","type","account","amount","status","errorReason","errorCode","idempotencyToken"
"0","CREDIT","1001001001","1000","COMPLETED",,,
"1","DEBIT","1001001001","100","COMPLETED",,,
"2","DEBIT","1001001001","200","COMPLETED",,,
Conclusion
The bridge we have built here should give you a good understanding of the reporting protocol and how to build a bridge that follows it. You can learn more in the rest of our documentation.
The code shared here is open source and you can use it freely. Of course, this is not a final, production-ready solution: you still need to adapt it to your specific needs, secure it properly, and host it.
Additionally, our open-source bridge SDKs and samples give you the tools necessary to build an entire integration, but you will still need to implement many things for this integration to be ready for production.
You will have to run performance tests and configure the solution properly for production. You will also need to securely store secrets in your infrastructure, build observability, monitoring, notifications, and many other things that you always have to do when deploying a new service.
Minka also provides a BigQuery reporting bridge that implements all of the above and which you can use to generate reports more easily. If you are interested in using it instead of building everything on your own, please contact your sales representative to learn more.