
Logger: Create RESTful Data Services

Although LOC data processes are not built and run exactly like microservices (containerized applications), they are versatile enough to provide similar functionality, and can be deployed or revised quickly thanks to their serverless nature.

Here we will learn how to create two simple RESTful services that can read or write user logs in a database.

Learning Objective
  1. To create two data processes that can read from and write to a database table with RESTful-like input and output.

  2. To create reusable logic modules and to control their behavior using logic variables.

  3. To create an agent configuration for connecting to the database.

  4. To create multiple API routes as the RESTful API endpoints.


Data Service Design

Database Table Schema

For this tutorial, we will create a simple table Log in the database (for which we will use Microsoft SQL Server as the example) with the following fields:

| Field | Type |
| --- | --- |
| ID | INT IDENTITY NOT NULL |
| Message | TEXT |
| Timestamp | DATETIME |
tip

IDENTITY in Transact-SQL is equivalent to AUTO_INCREMENT in other SQL variations.

Create or Drop Log Table in MS SQL Server

Create table

Transact-SQL
CREATE TABLE dbo.Log (
ID INT IDENTITY NOT NULL,
Message TEXT,
Timestamp DATETIME
);
GO

The table name is dbo.Log since dbo is the default schema for all tables in SQL Server.

Delete all rows in table

Transact-SQL
DELETE FROM dbo.Log;
GO

Note that deleting the table's contents will not reset the auto-increment value of ID.

Drop Table

Transact-SQL
DROP TABLE dbo.Log;
GO

To reset the auto increment value of ID, drop (remove) the table then create it again.

API Endpoints

We will create two RESTful-like APIs to read and write the Log table:

| Method | API path | Description | Input |
| --- | --- | --- | --- |
| POST | /api/data-service/v1/logs | Write logs | JSON payload |
| GET | /api/data-service/v1/logs?limit={n} | Query logs with a maximum limit of n | Querystring |

Input

Payload (request body)
{
"logs": ["Life, Universe and Everything", "42", "Don't Panic"]
}

The POST data service will throw an error or exception if no logs are provided in the payload.

Response

Task result (response body)
{
"status": "ok",
"statusCode": 201,
"timestamp": "2024-01-01T12:00:00.000000000Z",
"result": {
"affectedRowCount": 3
}
}

Response On Error

Task result (response body)
{
"status": "error",
"statusCode": 500,
"timestamp": "2024-01-01T18:00:00.000000000Z",
"result": {
"error": {
"name": "error name",
"message": "error message",
"stack": "error stack"
},
"task": {
"executionId": "execution id",
"taskId": "task id",
"dataProcessId": "data process id",
"logicId": "logic id where the error has occurred"
}
}
}

Trigger and Data Process

We will create two API routes and two data processes to implement the two APIs.

Although both APIs share the same URL, their HTTP methods differ, hence they will be handled by separate API routes in LOC.

Logic

In this tutorial, each data process will have three logic modules. Furthermore, the first and the last logic are shared between the two data processes, which means we only have to create four logic modules instead of six.

| Logic | Type | Name | Purpose |
| --- | --- | --- | --- |
| #1 | Generic | HTTP Payload Parser | Parse the payload body and querystring to JSON and write them into the session storage. |
| #2a (POST Log) | Generic | POST Log | Connect to the database, submit an action query to insert logs, then write the number of affected rows into the session storage. |
| #2b (GET Log) | Generic | GET Log | Connect to the database, query logs up to the given limit, then write the queried rows into the session storage. |
| #3 | Aggregator | Result Aggregator | Finalise a task result with whatever is stored as result in the session storage. |

Logic "Contracts"

For HTTP Payload Parser and Result Aggregator to be reusable logic modules, they must provide an "interface" of predetermined input and output:

As long as the POST Log and GET Log logic consume data from the session storage values body and/or params and write results into result and statusCode, their implementation can be modified without affecting HTTP Payload Parser or Result Aggregator.

The session storage values are designed as follows:

| Session Storage Value | Type | Description |
| --- | --- | --- |
| body | JSON | Parsed data from the HTTP payload body |
| params | JSON | Parsed data from the HTTP payload querystring |
| result | JSON | Data to be output into the task result |
| statusCode | JSON (as an integer number) | HTTP status code to be set with the task result |
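To illustrate part of this contract, here is a standalone sketch (plain Node.js, outside LOC) of how a raw querystring like the one accepted by the GET endpoint becomes the params session value:

```javascript
// Standalone sketch: how a raw querystring becomes the "params" JSON value
// that HTTP Payload Parser writes into the session storage.
const query = "limit=10";
const params = query ? Object.fromEntries(new URLSearchParams(query)) : null;
console.log(params); // { limit: '10' } (note: the value is a string)
```

Since URLSearchParams yields string values, downstream logic must convert numeric parameters such as limit itself.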

Create Agent Configuration

See: Manage Configuration

Create the following agent configuration for the database server (as mentioned above, we'll use Microsoft SQL Server as the example):

| Field | Value |
| --- | --- |
| Agent Configuration Name | MS SQL Server |
| Agent Configuration Type | Database Agent |
| Database Type | MSSQL |
| Host | (Your SQL server URL) |
| Port | 1433 |
| Username | (Your SQL server username) |
| Password | (Your SQL server password) |
| Database Name | (Your SQL server database name) |

Create and Build Logic

See: Create an Entry File and Build a Logic From an Entry File

Generic Logic: HTTP Payload Parser

The HTTP Payload Parser is designed with the following purposes:

  1. Read the trigger payload.
  2. Throw an error or exception if the payload is not an HTTP payload or does not contain one.
  3. Read and log the payload body and querystring.
  4. Parse the body and querystring to JSON objects (null if no data or failed to parse; an error log will be written but no error or exception will be thrown).
  5. Log the JSON objects and write them into the session storage as body and params respectively.
http-payload-parser.js
import { LoggingAgent, SessionStorageAgent } from "@fstnetwork/loc-logic-sdk";

/** @param {import('@fstnetwork/loc-logic-sdk').GenericContext} ctx */
export async function run(ctx) {
  const payload = await ctx.payload();

  if (!("http" in payload))
    throw new Error("this logic only accepts http payload");

  // decode the raw payload bytes to a string
  const data = payload.http.request.data;
  const decodedData = data
    ? new TextDecoder().decode(new Uint8Array(data))
    : "";
  const query = payload.http.request.query;

  LoggingAgent.info(`body: ${decodedData}`);
  LoggingAgent.info(`querystring: ${query}`);

  // parse the body to JSON; tolerate parse failures (body stays null)
  let body = null;
  try {
    if (data) body = JSON.parse(decodedData);
  } catch (e) {
    LoggingAgent.error(`error on parsing payload to JSON: ${e.message}`);
  }
  // parse the querystring to a plain object
  const params = query
    ? Object.fromEntries(new URLSearchParams(query))
    : null;

  LoggingAgent.info({ body: body });
  LoggingAgent.info({ params: params });

  await SessionStorageAgent.putJson("body", body);
  await SessionStorageAgent.putJson("params", params);
}

/**
 * @param {import('@fstnetwork/loc-logic-sdk').GenericContext} ctx
 * @param {import('@fstnetwork/loc-logic-sdk').RailwayError} error
 */
export async function handleError(ctx, error) {
  LoggingAgent.error({
    errorMessage: error.message,
    stack: error.stack,
    task: ctx.task.taskKey,
  });
}
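The byte-decoding step above can be tried outside LOC; a standalone sketch (plain Node.js) that fabricates the byte array the runtime would provide:

```javascript
// Standalone sketch: decoding a raw HTTP payload body the way the parser does.
// In LOC, payload.http.request.data is an array of bytes; we fabricate one here.
const data = Array.from(new TextEncoder().encode('{"logs":["42"]}'));

const decodedData = data.length
  ? new TextDecoder().decode(new Uint8Array(data))
  : "";

let body = null;
try {
  body = JSON.parse(decodedData);
} catch (e) {
  body = null; // malformed JSON is tolerated: body simply stays null
}

console.log(body); // { logs: [ '42' ] }
```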

Generic Logic: POST Log

post-log.js
import { DatabaseAgent, LoggingAgent, SessionStorageAgent } from "@fstnetwork/loc-logic-sdk";

/** @param {import('@fstnetwork/loc-logic-sdk').GenericContext} ctx */
export async function run(ctx) {
  const body = await SessionStorageAgent.get("body");
  const logs = body?.logs;

  if (!body || !logs) {
    await SessionStorageAgent.putJson("statusCode", 400);
    throw new Error("the session value 'body' does not exist or contains no logs");
  }

  // build a parameterized INSERT statement with two placeholders per log
  let placeholders = [];
  let sqlParams = [];
  const now = new Date().toISOString();
  logs.forEach((log, idx) => {
    placeholders.push(`(@P${idx * 2 + 1}, @P${idx * 2 + 2})`);
    sqlParams.push(log);
    sqlParams.push(now);
  });
  const statement = `INSERT INTO dbo.Log (Message, Timestamp) VALUES ${placeholders.join(
    ", ",
  )};`;
  LoggingAgent.info(`SQL statement: ${statement}`);
  LoggingAgent.info({ sqlParams: sqlParams });

  const dbClient = await DatabaseAgent.acquire("log-db-config-ref");
  let err = null;
  let statusCode = 201;
  try {
    // execute() submits an action query (INSERT/UPDATE/DELETE)
    await dbClient.execute(statement, sqlParams);

    const result = {
      affectedRowCount: logs.length,
    };

    LoggingAgent.info({ result: result });
    await SessionStorageAgent.putJson("result", result);
  } catch (e) {
    err = e;
    statusCode = 500;
  } finally {
    // always release the database client
    await dbClient.release();

    LoggingAgent.info(`statusCode: ${statusCode}`);
    await SessionStorageAgent.putJson("statusCode", statusCode);

    if (err) throw new Error(`database error: ${err.message}`);
  }
}

/**
 * @param {import('@fstnetwork/loc-logic-sdk').GenericContext} ctx
 * @param {import('@fstnetwork/loc-logic-sdk').RailwayError} error
 */
export async function handleError(ctx, error) {
  LoggingAgent.error({
    errorMessage: error.message,
    stack: error.stack,
    task: ctx.task.taskKey,
  });
}
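To see what the placeholder-building loop in POST Log produces, here is a standalone sketch (plain Node.js, no LOC SDK) run against the example payload from earlier in this tutorial; the generated statement has one placeholder pair per log:

```javascript
// Standalone sketch of the parameterized-INSERT construction in POST Log.
const logs = ["Life, Universe and Everything", "42", "Don't Panic"];
const now = "2024-01-01T12:00:00.000Z"; // fixed timestamp for readability

const placeholders = [];
const sqlParams = [];
logs.forEach((log, idx) => {
  placeholders.push(`(@P${idx * 2 + 1}, @P${idx * 2 + 2})`);
  sqlParams.push(log); // message
  sqlParams.push(now); // timestamp
});
const statement = `INSERT INTO dbo.Log (Message, Timestamp) VALUES ${placeholders.join(", ")};`;

console.log(statement);
// INSERT INTO dbo.Log (Message, Timestamp) VALUES (@P1, @P2), (@P3, @P4), (@P5, @P6);
console.log(sqlParams.length); // 6 (two parameters per log)
```

Passing the values as parameters instead of interpolating them into the SQL string avoids SQL injection.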

Generic Logic: GET Log

get-log.js
import { DatabaseAgent, LoggingAgent, SessionStorageAgent } from "@fstnetwork/loc-logic-sdk";

/** @param {import('@fstnetwork/loc-logic-sdk').GenericContext} ctx */
export async function run(ctx) {
  const params = await SessionStorageAgent.get("params");

  // fall back to 1000 if limit is missing or not a positive integer
  const parsedLimit = Number(params?.limit);
  const limit =
    Number.isInteger(parsedLimit) && parsedLimit > 0 ? parsedLimit : 1000;

  const statement =
    "SELECT TOP (@P1) * FROM dbo.Log ORDER BY Timestamp DESC, ID DESC;";
  LoggingAgent.info(`SQL statement: ${statement}`);
  LoggingAgent.info({ sqlParams: [limit] });

  const dbClient = await DatabaseAgent.acquire("log-db-config-ref");
  let err = null;
  let statusCode = 200;
  try {
    // query() submits a SELECT query and returns the matched rows
    const dbResult = await dbClient.query(statement, [limit]);

    const logs = dbResult?.rows || [];
    const result = {
      logCount: logs.length,
      logs: logs.map((log) => {
        return {
          id: Number(log.ID),
          message: String(log.Message),
          timestamp: String(log.Timestamp),
        };
      }),
    };

    LoggingAgent.info({ result: result });
    await SessionStorageAgent.putJson("result", result);
  } catch (e) {
    err = e;
    statusCode = 500;
  } finally {
    // always release the database client
    await dbClient.release();

    LoggingAgent.info(`statusCode: ${statusCode}`);
    await SessionStorageAgent.putJson("statusCode", statusCode);

    if (err) throw new Error(`database error: ${err.message}`);
  }
}

/**
 * @param {import('@fstnetwork/loc-logic-sdk').GenericContext} ctx
 * @param {import('@fstnetwork/loc-logic-sdk').RailwayError} error
 */
export async function handleError(ctx, error) {
  LoggingAgent.error({
    errorMessage: error.message,
    stack: error.stack,
    task: ctx.task.taskKey,
  });
}
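The limit value arrives from the querystring as a string (or is absent entirely), so it needs a defensive fallback before reaching the SQL statement. A minimal standalone sketch of one such rule:

```javascript
// Standalone sketch: defensive parsing of the "limit" querystring parameter,
// falling back to a default of 1000 for anything missing or invalid.
function parseLimit(params) {
  const parsed = Number(params?.limit);
  return Number.isInteger(parsed) && parsed > 0 ? parsed : 1000;
}

console.log(parseLimit({ limit: "50" })); // 50
console.log(parseLimit({ limit: "abc" })); // 1000 (NaN is rejected)
console.log(parseLimit(null)); // 1000 (no querystring at all)
```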

Aggregator Logic: Result Aggregator

The Result Aggregator is responsible for collecting whatever result the previous logic has stored, or handling the railway error if there is one.

result-aggregator.js
import { LoggingAgent, ResultAgent, SessionStorageAgent } from "@fstnetwork/loc-logic-sdk";

/** @param {import('@fstnetwork/loc-logic-sdk').GenericContext} ctx */
export async function run(ctx) {
  const result = await SessionStorageAgent.get("result");
  let statusCode = await SessionStorageAgent.get("statusCode");

  // default to 200 if no valid status code was stored
  if (!statusCode || typeof statusCode != "number") statusCode = 200;

  LoggingAgent.info({ result: result });
  LoggingAgent.info(`statusCode: ${statusCode}`);

  ResultAgent.finalize({
    status: "ok",
    statusCode: statusCode,
    timestamp: new Date().toISOString(),
    result: result,
  }).httpStatusCode(statusCode);
}

/**
 * @param {import('@fstnetwork/loc-logic-sdk').GenericContext} ctx
 * @param {import('@fstnetwork/loc-logic-sdk').RailwayError} error
 */
export async function handleError(ctx, error) {
  let statusCode = await SessionStorageAgent.get("statusCode");

  // default to 500 if no valid status code was stored
  if (!statusCode || typeof statusCode != "number") statusCode = 500;

  LoggingAgent.info(`statusCode: ${statusCode}`);

  ResultAgent.finalize({
    status: "error",
    statusCode: statusCode,
    timestamp: new Date().toISOString(),
    result: {
      error: {
        name: error.name,
        message: error.message,
        stack: error.stack,
      },
      task: {
        executionId: ctx.task.taskKey.executionId,
        taskId: ctx.task.taskKey.taskId,
        dataProcessId: ctx.task.dataProcess.permanentIdentity,
        logicId: error.logicPermanentIdentity,
      },
    },
  }).httpStatusCode(statusCode);
}
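The status-code fallback used in both run() and handleError() can be illustrated on its own; a standalone sketch (plain Node.js) of that rule:

```javascript
// Standalone sketch: anything in session storage that is not a number
// is replaced by the fallback (200 on success, 500 on error).
function normalizeStatusCode(stored, fallback) {
  return !stored || typeof stored !== "number" ? fallback : stored;
}

console.log(normalizeStatusCode(201, 200)); // 201
console.log(normalizeStatusCode(null, 200)); // 200
console.log(normalizeStatusCode("500", 500)); // 500 (a string is rejected)
```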

Create Data Processes

See: Create Data Process

Create the two data processes POST Log Service and GET Log Service with the four logic modules we've created and built:

| | POST Log Service | GET Log Service |
| --- | --- | --- |
| Generic #1 | HTTP Payload Parser | HTTP Payload Parser |
| Generic #2 | POST Log (agent configuration reference: log-db-config-ref) | GET Log (agent configuration reference: log-db-config-ref) |
| Aggregator | Result Aggregator | Result Aggregator |

Add Configuration to Logic

See: Add Configuration to Data Process

While creating or editing the data processes above, add the database agent configuration we've created to POST Log and GET Log with the reference name log-db-config-ref.


Create and Invoke API Routes

See: API Route

Create two API routes, one for each data process (Encapsulation must be set to False in both cases):

POST Log

| Field | Value |
| --- | --- |
| API Route Name | POST Log API |
| HTTP Method | POST |
| URL | /api/data-service/v1/logs |
| Request Mode | Sync |
| Response Content Type | JSON |
| Encapsulation | False |
| API Tasks | Data Process POST Log Service (latest revision) |

API invoke example:

curl -X POST -d '{
"logs": ["Life, Universe and Everything", "42", "Don'\''t Panic"]
}' 'https://{LOC server}/api/data-service/v1/logs'

GET Log

| Field | Value |
| --- | --- |
| API Route Name | GET Log API |
| HTTP Method | GET |
| URL | /api/data-service/v1/logs |
| Request Mode | Sync |
| Response Content Type | JSON |
| Encapsulation | False |
| API Tasks | Data Process GET Log Service (latest revision) |

API invoke example:

curl -X GET 'https://{LOC server}/api/data-service/v1/logs?limit=1000'

With everything in place, you've successfully created two RESTful APIs that can read or write a database table.

Creating Additional APIs and Functionalities

Following this tutorial, you can create other data services in the same manner to support additional methods - for example, a PATCH service for updating logs and a DELETE service for removing them.

You can also add functionalities to the data services, for example, user authentication using either request headers or query string parameters, and additional data validation for the values in payload or querystring.
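As one illustration of the authentication idea, a header check like the following could run near the start of a logic. Note that the header name x-api-token and the token comparison below are purely hypothetical assumptions, not part of this tutorial's services; a real deployment should load the expected token from an agent configuration or secret rather than a hard-coded value:

```javascript
// Hypothetical header-based auth check (sketch only; the header name
// "x-api-token" is an assumption for illustration).
function isAuthorized(headers, expectedToken) {
  const token = headers?.["x-api-token"];
  return typeof token === "string" && token === expectedToken;
}

console.log(isAuthorized({ "x-api-token": "my-secret" }, "my-secret")); // true
console.log(isAuthorized({}, "my-secret")); // false
```

On failure, the logic could write statusCode 401 into the session storage and throw, letting Result Aggregator finalize the error response as shown earlier.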