Documenting your API
Before we celebrate ourselves we have to tackle one missing point: we have to document our API. Isn’t well-documented code enough?
No, it is not! Leaving the issue of properly documenting the code itself aside here, we will concentrate on documenting the API. The de facto standard these days seems to be Swagger1, so to keep things simple we will stick to it. Besides that it won’t hurt to have some documentation in text form (a small file can be enough) which explains the quirks of our API. The bigger the project, the earlier you may encounter flaws in the logic which cannot be changed for whatever reasons. ;-)
The lay of the land
Swagger provides a bit of tooling and there are likely lots of projects trying to bring it to your favourite web framework or tool kit. In many cases it might be a good idea to bundle the Swagger UI2 with your service and make it available on a specific path. Depending on your needs and environment you’ll want to protect that path via authentication and make it configurable for turning it off in production.
Having the UI you must decide which path you want to go down:
- Write a swagger.yml file which describes your API, create JSON from it and deliver that as an asset.
- Create the JSON dynamically via some library at runtime.
The first point has the benefit that your description will be more or less set in stone and you avoid possible performance impacts and other quirks at runtime. However, you must think of a way to test that your service actually fulfils the description (read: specification) that you deliver. The most common tool for writing your API description is probably the Swagger Editor3.
Taking the second path will result in a description which reflects your actual code. However, none of the tools I’ve seen so far has fulfilled that promise to 100 percent. You’ll very likely have to make extensive use of annotations to make your API description usable, which again decouples code and description. You might also encounter “funny” deduced data types like Future and so on, which are annoying and confusing for the user. For those favouring the impure approach there is the swagger-akka-http library4.
So, can we do better? The answer is yes! We can describe our API using static types, have the compiler check it, and derive server and client code from it.
Using types to describe an API
Describing your API using types is not the bleeding-edge academic research you might suspect. There are several libraries out there for it! :-)
Personally, I first stumbled upon this approach some years ago when watching the talk “Using object algebras to design embedded DSLs” (Curry On 2016)5. The related project is the endpoints library6. However, there are other projects too, among them the rho library7, which is part of the http4s project, and tapir8, which we will be using in our example.
In the Haskell camp there is the beautiful Servant library9.
First, I wanted to use something which allows us to generate an http4s server; this already narrowed down the options a bit. It should also be able to generate API documentation (which nowadays means Swagger/OpenAPI support). Furthermore it should support not only http4s but other backends as well. So after playing around a bit I decided to use the tapir library.
A pure implementation using tapir
We basically clone our pure folder into the tapir folder and start to apply our changes to the already pure implementation. But first some theory.
Basics
The tapir library assumes that you describe your API using the Endpoint type, which is defined as follows: Endpoint[I, E, O, S]
- The type I defines the input given to the endpoint.
- The type E defines the error (or errors) which may be returned by the endpoint.
- The type O defines the possible output of the endpoint.
- The type S specifies the type of streams which are used for in- and output.
An endpoint can have attributes like name or description which will be used in the generated documentation. You can also map input and output parameters into case classes.
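To make these type parameters a bit more tangible, here is a small, hedged sketch of a toy endpoint. It is not part of our service, and the query and stringBody helpers as well as the exact imports are assumed to be available in our tapir version.

import tapir._
import tapir.model.StatusCode

// A made-up endpoint, just to see the four type parameters in action:
// I = String (a query parameter), E = StatusCode, O = String, S = Nothing.
val hello: Endpoint[String, StatusCode, String, Nothing] =
  endpoint.get
    .in("hello")
    .in(query[String]("name").description("The name to greet."))
    .errorOut(statusCode)
    .out(stringBody)
    .description("Greets the caller by name.")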
Regarding the encoding and decoding of data we will need our Circe codecs plus some additional schema definitions required by tapir. Concretely, we need to define implicit SchemaFor[T] instances for each of our models. We will start with the Translation model.
implicit val schemaFor: SchemaFor[Translation] = SchemaFor(
  Schema.SProduct(
    Schema.SObjectInfo("Translation"),
    List(("lang", Schema.SString), ("name", Schema.SString)),
    List("lang", "name")
  )
)
As you can see this is quite straightforward and not very complicated. In the future we might be able to derive such instances, but for now we have to define them by hand. The schema is defined as a product type which is described by the “object info” type containing a name, a list of the field names paired with their schemas, and lastly a list of the field names which are required to construct the type.
implicit val schemaFor: SchemaFor[Product] = SchemaFor(
  Schema.SProduct(
    Schema.SObjectInfo("Product"),
    List(
      ("id", Schema.SString),
      ("names", Schema.SArray(Translation.schemaFor.schema))
    ),
    List("id", "names")
  )
)
This is basically the same thing as before, except that we rely on the existing schema definition of Translation and use it here. You might notice that we explicitly define our NonEmptySet as an SArray, because we want a list-like representation on the JSON side.
But can we generalise this? Yes, we can! :-) To gain more flexibility we can provide a generic schema for our NonEmptySet type.
implicit def schemaForNeS[T](implicit a: SchemaFor[T]): SchemaFor[NonEmptySet[T]] =
  SchemaFor(Schema.SArray(a.schema))
Here we define that a NonEmptySet will be an array using the schema of whatever type it contains. Now we rewrite our previous schema definition as follows.
implicit val schemaFor: SchemaFor[Product] = SchemaFor(
  Schema.SProduct(
    Schema.SObjectInfo("Product"),
    List(
      ("id", Schema.SString),
      ("names", schemaForNeS[Translation].schema)
    ),
    List("id", "names")
  )
)
It is not necessarily shorter, but we have gained some flexibility and can reuse our schema for the non-empty set in several places.
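To illustrate that reuse with a hedged sketch: any other model containing a NonEmptySet field can now lean on the generic instance. The Tag type below is purely hypothetical and not part of our service.

// Hypothetical model, only here to show reusing schemaForNeS.
final case class Tag(value: String)

object Tag {
  implicit val schemaFor: SchemaFor[Tag] = SchemaFor(
    Schema.SProduct(
      Schema.SObjectInfo("Tag"),
      List(("value", Schema.SString)),
      List("value")
    )
  )
}

// The generic instance gives us the schema for the whole set for free.
val tagSetSchema: SchemaFor[NonEmptySet[Tag]] = schemaForNeS[Tag]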
Product routes
Having the basics settled we can try to write our first endpoint. Let’s refactor our product routes. We will define our endpoints in the companion object of the class.
// Our type is Endpoint[ProductId, StatusCode, Product, Nothing]
val getProduct = endpoint.get
  .in("product" / path[ProductId]("id"))
  .errorOut(statusCode)
  .out(jsonBody[Product])
So what do we have here? First we specify the HTTP method by using the get function of the endpoint. Now we need to define our path and inputs. We do this using the in helper, which accepts path fragments separated by slashes, and the path[T] helper, which allows us to extract a type directly from a path fragment. This way we define our entry point product/{id}, in which the id fragment must match our ProductId type.
To be more flexible about the returned status codes we use the errorOut function, which in our case simply emits a status code (indicated by passing statusCode to it).
Finally we define that the endpoint will return the JSON representation of a product by using the out and jsonBody helpers. This all is reflected in the actual type signature of our endpoint which reads Endpoint[ProductId, StatusCode, Product, Nothing]. If we remember the basics then we know that this amounts to an endpoint which takes a ProductId as input, produces a StatusCode as possible error and returns a Product upon success.
Our endpoint alone won’t do us any good so we need an actual server side implementation of it. While we could have used the serverLogic function to directly attach our logic onto the endpoint definition this would have nailed us down to a concrete server implementation.
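Just for comparison, here is a rough, hedged sketch of that rejected alternative, assuming that serverLogic takes a function of the shape I => F[Either[E, O]] and that we are inside the class where repo and a Functor for F are in scope.

// A sketch only: this couples the endpoint definition to its server logic.
val getProductWithLogic =
  ProductRoutes.getProduct.serverLogic[F] { id =>
    repo.loadProduct(id).map { rows =>
      Product.fromDatabase(rows).toRight(StatusCodes.NotFound)
    }
  }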
So we’re going to implement it in the ProductRoutes class.
val getRoute: HttpRoutes[F] = ProductRoutes.getProduct.toRoutes { id =>
  for {
    rows <- repo.loadProduct(id)
    resp = Product
      .fromDatabase(rows)
      .fold(StatusCodes.NotFound.asLeft[Product])(_.asRight[StatusCode])
  } yield resp
}
We use the toRoutes helper of tapir which expects a function with the actual logic. As you can see the implementation is straightforward and only differs slightly from our original one. Currently there is no other way to handle our “not found” case than using the fold at the end. But if you remember, we did the same thing in the original code.
That was not that difficult, for something which some people like to talk about as “academic fantasies from fairy tale land”. ;-)
Onward to our next route: updating an existing product. First we need to define our endpoint.
// Our type is Endpoint[(ProductId, Product), StatusCode, Unit, Nothing]
val updateProduct =
  endpoint.put
    .in("product" / path[ProductId]("id"))
    .in(
      jsonBody[Product]
        .description("The updated product data which should be saved.")
    )
    .errorOut(statusCode)
    .out(statusCode(StatusCodes.NoContent))
This is only slightly more code than our first endpoint. We use the put method this time and the same logic as before to extract our product id from the path. But we also need our product input, which we expect as JSON in the request body. The jsonBody function is extended with the description helper here, which will provide data for any generated documentation. We’ll come to generated API docs later on.
We also restrict our errors to status codes via the errorOut(statusCode) directive. Last but not least we have to define our output. By default a status code of “200 OK” would be used, which is why we override it in the out function with the “204 No Content” we prefer.
private val updateRoute: HttpRoutes[F] =
  ProductRoutes.updateProduct.toRoutes {
    case (id, p) =>
      for {
        cnt <- repo.updateProduct(p)
        res = cnt match {
          case 0 => StatusCodes.NotFound.asLeft[Unit]
          case _ => ().asRight[StatusCode]
        }
      } yield res
  }
The implementation is again very similar to our original one, except that it is even a bit simpler. This is because we do not have to worry about wrongly encoded input (remember our handleErrorWith directive?). The tapir library will by default return a “400 Bad Request” status if any provided input cannot be decoded.
Within the pattern match we use the status codes provided by tapir and map the returned values to the correct type, which is an Either[StatusCode, Unit]. This follows from our endpoint type signature Endpoint[(ProductId, Product), StatusCode, Unit, Nothing], which translates to an input of both a ProductId and a Product, a StatusCode in the error case, and Unit upon success.
Now we only need to combine both routes and we’re set.
@SuppressWarnings(Array("org.wartremover.warts.Any"))
val routes: HttpRoutes[F] = getRoute <+> updateRoute
So, let us run our tests and see what happens.
[info] when PUT /product/ID
[info]   when request body is invalid
[info]   - must return 400 Bad Request *** FAILED ***
[info]     TestFailedException was thrown during property evaluation.
[info]     Message: Vector(...) was not empty
[info]     Location: (ProductRoutesTest.scala:100)
[info]     Occurred when passed generated values (
[info]       id = b55df341-a165-40c4-87ba-3d1c5cfb2f0c
[info]     )
Well, not what we expected, or is it? To be honest I personally expected more errors, but maybe I’ve just been doing this stuff for too long. ;-)
If we look into the error we find that, because the encoding problems are now handled for us, the response not only contains a status code of “400 Bad Request” but also an error message: “Invalid value for: body”. Because I’m fine with that, I just adjust the test and let it be. :-)
Pretty awesome, we already have half of our endpoints done. So let’s move on to the remaining ones and finally see how to generate documentation and also a client for our API.
Products routes
val createProduct: Endpoint[Product, StatusCode, Unit, Nothing] =
  endpoint.post
    .in("products")
    .in(
      jsonBody[Product]
        .description("The product data which should be created.")
    )
    .errorOut(statusCode)
    .out(statusCode(StatusCodes.NoContent))
As we can see, the endpoint definition for creating a product hardly differs from the one used for updating: we have a different path here and do not need to extract a ProductId from the URL path.
val createRoute: HttpRoutes[F] =
  ProductsRoutes.createProduct.toRoutes { product =>
    for {
      cnt <- repo.saveProduct(product)
      res = cnt match {
        case 0 => StatusCodes.InternalServerError.asLeft[Unit]
        case _ => ().asRight[StatusCode]
      }
    } yield res
  }
The implementation is again pretty simple. In case the saveProduct function returns zero we output a “500 Internal Server Error” because the product has not been saved to the database.
Finally we have our streaming endpoint left, so let’s see how we can do this via tapir.
// Our type is Endpoint[Unit, StatusCode, Stream[F, Byte], Stream[F, Byte]]
def getProducts[F[_]] =
  endpoint.get
    .in("products")
    .errorOut(statusCode)
    .out(
      streamBody[Stream[F, Byte]](schemaFor[Byte], tapir.MediaType.Json())
    )
The first thing we can see is that we use a def instead of a val this time. This is caused by some necessities on the Scala side. If we want to abstract over a type parameter then we need to use a def here.
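As a quick, hedged illustration of that restriction (the helper below is made up and not part of our code): a val cannot take type parameters at all, so abstracting over the effect type forces us to use a def.

import fs2.Stream

// Only a def can abstract over the effect type F[_].
def emptyByteStream[F[_]]: Stream[F, Byte] = Stream.empty
// val emptyByteStream[F[_]]: Stream[F, Byte] = Stream.empty // does not compile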
We have also set the last type parameter to something concrete instead of Nothing this time, because we actually want to stream something. ;-)
It is a bit annoying that we have to specify the stream type twice (once for the output type and once for the “stream” type). Much nicer would be something like Endpoint[I, E, Byte, Stream[F, _]], but currently this is not possible.
So we again specify the HTTP method (via get) and the path (which is “products”). The errorOut helper once again restricts our error output to the status code. Finally we set the output of the endpoint by declaring a streaming entity (via streamBody).
The schema we pass to it only describes a plain byte, but that is sufficient, and we also directly specify the returned media type to be JSON.
@SuppressWarnings(Array("org.wartremover.warts.Any"))
val getRoute: HttpRoutes[F] = ProductsRoutes.getProducts.toRoutes {
  val prefix = Stream.eval("[".pure[F])
  val suffix = Stream.eval("]".pure[F])
  val ps = repo.loadProducts
    .groupAdjacentBy(_._1)
    .map {
      case (id, rows) => Product.fromDatabase(rows.toList)
    }
    .collect {
      case Some(p) => p
    }
    .map(_.asJson.noSpaces)
    .intersperse(",")
  val result: Stream[F, String] = prefix ++ ps ++ suffix
  val bytes: Stream[F, Byte] = result.through(fs2.text.utf8Encode)
  bytes
}
Again our implementation is almost the same as the original one, except that in the end we convert our stream of String into a stream of Byte using the utf8Encode helper from the fs2 library. But the compiler is not happy with it:
found : fs2.Stream[F,Byte]
required: Unit => F[Either[tapir.model.StatusCode,fs2.Stream[F,Byte]]]
(which expands to) Unit => F[Either[Int,fs2.Stream[F,Byte]]]
bytes
^
Damn, so close. But let’s keep calm and think, or ask around on the internet, which is totally fine. Actually, it is all in the compiler error message: toRoutes expects a function from our input type (here Unit) to F of an Either, not a bare stream.
@SuppressWarnings(Array("org.wartremover.warts.Any"))
val getRoute: HttpRoutes[F] = ProductsRoutes.getProducts.toRoutes {
  // ...
  val result: Stream[F, String] = prefix ++ ps ++ suffix
  val bytes: Stream[F, Byte] = result.through(fs2.text.utf8Encode)
  val response: Either[StatusCode, Stream[F, Byte]] = Right(bytes)
  (_: Unit) => response.pure[F]
}
We first convert our response explicitly into the right side of an Either because the left side is used for the error case. Afterwards we provide the needed Unit => ... function in which we lift our response value via pure into the context of F.
So let’s go crazy and simply combine our routes like in the previous part and run the test via testOnly *.ProductsRoutesTest on the sbt console.
[info] All tests passed.
Yes! Very nice, it seems like we are done with implementing our routes via tapir endpoints.
Documentation via OpenAPI
So, we can now look at documenting our API via OpenAPI using the tooling provided by tapir. To do so we modify our main application entry point to generate and serve the documentation for us.
// ...
productRoutes = new ProductRoutes(repo)
productsRoutes = new ProductsRoutes(repo)
docs = List(
  ProductRoutes.getProduct,
  ProductRoutes.updateProduct,
  ProductsRoutes.getProducts,
  ProductsRoutes.createProduct
).toOpenAPI("Pure Tapir API", "1.0.0")
docsRoutes = new SwaggerHttp4s(docs.toYaml)
routes = productRoutes.routes <+> productsRoutes.routes
httpApp = Router(
  "/" -> routes,
  "/docs" -> docsRoutes.routes
).orNotFound
// ...
We use the toOpenAPI helper provided by tapir, which generates a class structure describing our API from a list of given endpoints. Additionally we use the SwaggerHttp4s helper, which bundles the Swagger UI for simple documentation browsing. All of it is made available under the /docs path, so opening http://localhost:57344/docs in your browser should show the UI with the correct documentation.
But while browsing there we can see that although it lists our models and endpoints, the documentation could be better. So what can we do about it?
The answer is simple: Use the helpers provided by tapir to add additional information to our endpoints.
Providing example data
Besides functions like description or name, tapir also provides example, which results in concrete examples being included in the documentation. To use it we must construct example values of the needed types. A Product example could look like this.
val example = Product(
  id = java.util.UUID.randomUUID,
  names = NonEmptySet.one(
    Translation(
      lang = "de",
      name = "Das ist ein Name."
    )
  ) ++
    NonEmptySet.one(
      Translation(
        lang = "en",
        name = "That's a name."
      )
    ) ++
    NonEmptySet.one(
      Translation(
        lang = "es",
        name = "Ese es un nombre."
      )
    )
)
We can now use it in our product endpoint description.
val getProduct = endpoint.get
  .in(
    "product" / path[ProductId]("id")
      .description("The ID of a product which is a UUID.")
      .example(example.id)
  )
  .errorOut(statusCode)
  .out(
    jsonBody[Product]
      .description("The product associated with the given ID.")
      .example(example)
  )
  .description(
    "Returns the product specified by the ID given in the URL path. " +
      "If the product does not exist then a HTTP 404 error is returned."
  )
As you can see we make use of description and example here. Also the path parameter id is described that way.
val updateProduct =
  endpoint.put
    .in(
      "product" / path[ProductId]("id")
        .description("The ID of a product which is a UUID.")
        .example(example.id)
    )
    .in(
      jsonBody[Product]
        .description("The updated product data which should be saved.")
        .example(example)
    )
    .errorOut(statusCode)
    .out(
      statusCode(StatusCodes.NoContent)
        .description("Upon successful product update no content is returned.")
    )
    .description(
      "Updates the product specified by the ID given in the URL path. " +
        "The product data has to be passed encoded as JSON in the request body. " +
        "If the product does not exist then a HTTP 404 error is returned."
    )
Here we also add a description to the plain status code output, explaining explicitly that no content will be returned upon success. While the 204 status code should be enough to say this, you can never be too explicit. ;-)
We’ll skip the create endpoint because it looks nearly the same as the update endpoint. Instead let’s take a look at our streaming endpoint.
def getProducts[F[_]] =
  endpoint.get
    .in("products")
    .errorOut(statusCode)
    .out(
      streamBody[Stream[F, Byte]](schemaFor[Byte], tapir.MediaType.Json())
        .example(examples.toList.asJson.spaces2)
    )
    .description(
      "Return all existing products in JSON format as a stream of bytes."
    )
This time we need to provide our example as a string because of the nature (read: type) of our endpoint. We use the non-empty list of examples that we created (you can look it up in ProductsRoutes.scala) and convert it into a JSON string.
If we now visit our Swagger endpoint we’ll see nice examples included in the documentation. Pretty cool, especially because many people will look at the examples rather than at the specification. This might be because (too) many of us have seen an API not fulfilling its specification. This shouldn’t happen in our case because we’re deriving it, yeah! But nonetheless examples are very nice to have. :-)
If we take a closer look at our model descriptions then we might see that we could do better in some cases. I’m thinking of our ID fields being simple strings instead of UUIDs and the language code which is also defined as a simple string. So let’s get going and clean that up!
Refining the generated documentation
Looking at the intermediate model for our API documentation (see the OpenAPI class structure in the tapir library) we realise that modifying such a deeply nested case class structure might result in some really messy code. I mean, we have probably all been there at some point in our lives as developers. ;-)
So can we do better? Yes, we can! Confronted with big, nested structures we should pick a tool from our functional programming toolbox called optics10.
Don’t be scared by the name or all the mathematics behind it; there are some usable libraries for it. In our case we will pick Monocle11, which provides optics for Scala. The basic idea of optics is to provide purely functional abstractions for the manipulation of immutable objects. Because of their pure nature they compose, which results in code that is more flexible and easier to reason about.
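Before we apply optics to the OpenAPI structure, here is a tiny, self-contained sketch of the core idea. The Address and Street types are made up for illustration: lenses focus on a single field and compose, so we can update a deeply nested value without writing nested copy calls.

import monocle.Lens
import monocle.macros.GenLens

final case class Street(name: String)
final case class Address(street: Street)

val streetL: Lens[Address, Street] = GenLens[Address](_.street)
val nameL: Lens[Street, String] = GenLens[Street](_.name)

// Composing both lenses lets us modify the nested field in one go.
val shout: Address => Address = (streetL composeLens nameL).modify(_.toUpperCase)
// shout(Address(Street("main"))) == Address(Street("MAIN"))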
Now that we have that cleared up, let’s make a plan of what we actually want to do:
- Adjust the URL parameter descriptions of {id} to mark them as UUIDs.
- Adjust the id attribute of our Product model to mark it as a UUID.
- Adjust the lang attribute of our Translation model to mark it as an ISO-639-1 language code.
If we take a look at the code within the tapir library, we see that the Schema and SchemaFor code which is used for codecs does not yet support a dedicated UUID type. There is a detour for it using Schema.SString.
Looking a bit further into the OpenAPI code we find that it supports several interesting attributes which we might use. For now we will stick to the pattern attribute of the Schema class. It is intended to hold a pattern (read: regular expression) which describes the format of a string type.
Okay, so we need to define some (or rather, exactly two) regular expressions. But hey, wait: we already have one for our language code! :-)
Well, using some shapeless12 magic we might use an implicit Witness which should be provided by our refined type.
def extractRegEx[S <: String](implicit ws: Witness.Aux[S]): String =
  ws.value
The code above is a rough idea so don’t count on it. We’ll see later on if I was right or did suffer from the hallucination of actually understanding what I am doing. ;-)
Hoping that this is settled, we need one additional regular expression for a UUID. UUIDs are defined in RFC-412213 and, ignoring the special edge case of the “nil UUID”, we come up with the following solution.
^[0-9a-f]{8}-[0-9a-f]{4}-[1-5][0-9a-f]{3}-[089ab][0-9a-f]{3}-[0-9a-f]{12}$
According to the OpenAPI specification we are nailed down to Javascript regular expressions (read the ECMA 262 regular expression dialect14). Oh why cruel fate? Well, let us deal with that when we have our other puzzle pieces in order.
Before we dive right in let’s play around a bit to get used to this fancy optics thing.
val docs: OpenAPI = ???
val paths: Lens[OpenAPI, ListMap[String, PathItem]] =
  GenLens[OpenAPI](_.paths)
val test = (paths composeLens at("/product/{id}")).get(docs)
We define our first Lens via the GenLens macro. It is supposed to give us the path definitions from an OpenAPI object. In the last line we use the compose functionality together with at to query a specific item from the paths, which we address via a string because they are stored in a ListMap with string keys.
[error] ...scala: ambiguous implicit values:
[error] both method atMap in object At of type
        [K, V]=> monocle.function.At[Map[K,V],K,Option[V]]
[error] and method atSet in object At of type
        [A]=> monocle.function.At[Set[A],A,Boolean]
[error] match expected type monocle.function.At[S,String,A]
[error] val x = (paths composeLens at("/product/{id}")).get(d)
[error] ^
[error] one error found
Oh no, the compiler yells at us! Some investigation leads to the conclusion that the At type class instance for ListMap is missing. So we build one. Luckily for us it is practically identical to the one for Map.
implicit def atListMap[K, V]: At[ListMap[K, V], K, Option[V]] = At(
  i => Lens((_: ListMap[K, V]).get(i))(optV => map =>
    optV.fold(map - i)(v => map + (i -> v))
  )
)
And now our code compiles, hooray! :-)
While we’re at it we also provide an instance for Index on ListMap.
implicit def listMapIndex[K, V]: Index[ListMap[K, V], K, V] = Index.fromAt
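A hedged usage sketch, reusing the docs value and the paths lens from above: with the Index instance in scope we can zoom in on an existing key directly, without the Option wrapping that at gives us.

import monocle.function.all._

// index focuses on the value behind an existing key (an Optional).
val productItem: Option[PathItem] =
  (paths composeOptional index("/product/{id}")).getOption(docs)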
Okay, back to work. Thinking a bit about our problem we realise that we need a couple of lenses which we can compose to modify the needed parts of the documentation structure.
val paths: Lens[OpenAPI, ListMap[String, PathItem]] =
  GenLens[OpenAPI](_.paths)
val getOps: Lens[PathItem, Option[Operation]] =
  GenLens[PathItem](_.get)
val putOps: Lens[PathItem, Option[Operation]] =
  GenLens[PathItem](_.put)
val operationParams: Lens[Operation, List[OpenAPI.ReferenceOr[Parameter]]] =
  GenLens[Operation](_.parameters)
val pathParams: Lens[PathItem, List[OpenAPI.ReferenceOr[Parameter]]] =
  GenLens[PathItem](_.parameters)
val parameterSchema: Lens[Parameter, OpenAPI.ReferenceOr[Schema]] =
  GenLens[Parameter](_.schema)
val schemaPattern: Lens[Schema, Option[String]] =
  GenLens[Schema](_.pattern)
Well this is quite a lot but let’s break it apart piece by piece. In general we use the GenLens macro of Monocle to create the lenses for us.
First we create a lens which returns the defined path items which is a ListMap. We later use the at function of the type class to grab a concrete entry from it. This “concrete” entry will be a PathItem that contains more information. Next are some lenses which will return an Operation from the aforementioned PathItem. Depending on the type of the operation (GET, POST, etc.) we return the appropriate entry.
Now we need to grab the parameters used for the endpoints which can either be collected directly from a PathItem or an Operation. These are both lists of the type Parameter. Okay, I’m lying straight to your face here. In fact they are ReferenceOr[Parameter] which means an Either[Reference, Parameter]. But in our use case we only have them as parameters so we’ll ignore this for now.
Last but not least we need to grab the Schema of a parameter and from that one the pattern field. The schema describing the parameter is also an Either[Reference, Schema] which we will ignore too in this case. Now we can play around with our lenses and various combinations.
// Delete the entry at "/product/{id}"
(paths composeLens at("/product/{id}")).set(None)(docs)
// Replace the path parameters with an empty list at "/product/{id}"
(paths composeLens at("/product/{id}") composeOptional possible
  composeLens pathParams)
  .set(List.empty)(docs)
// Traverse through all schemas in all path parameters at "/product/{id}"
(paths composeLens at("/product/{id}") composeOptional possible
  composeLens pathParams composeTraversal each composeOptional
  possible composeLens parameterSchema)
  .getAll(docs)
// Set the pattern field in all schemas in all path parameters
// at "/product/{id}"
(paths composeLens at("/product/{id}") composeOptional possible
  composeLens pathParams composeTraversal each composeOptional
  possible composeLens parameterSchema composeOptional possible
  composeLens schemaPattern)
  .set(Option("Optics are soo cool!"))(docs)
This can be confusing at first glance but it is actually very powerful and clean for modifying deeply nested structures. The current API is a bit verbose, but there is hope.
So what we can take away from this is that we can update our pattern field within all parameters using the following code. Before you ask: We can update all parameters because we have only one. ;-)
(paths composeLens at("/product/{id}") composeOptional possible
  composeLens pathParams composeTraversal each composeOptional
  possible composeLens parameterSchema composeOptional possible
  composeLens schemaPattern)
  .set(Option("Fancy UUID regex here!"))(docs)
It seems we can (more or less) check off the first point on our list, leaving us with modifying the pattern field of the actual model schemas. To get to these we have to define some more lenses.
val components: Lens[OpenAPI, Option[Components]] =
  GenLens[OpenAPI](_.components)
// type Lens[Components, ListMap[String, OpenAPI.ReferenceOr[Schema]]]
val componentsSchemas =
  GenLens[Components](_.schemas)
// type Lens[Schema, ListMap[String, OpenAPI.ReferenceOr[Schema]]]
val schemaProperties =
  GenLens[Schema](_.properties)
The power of lenses allows us to traverse all of our structures and modify all affected models at once. So we will use the functionality provided by the Each type class here. Let’s try something like the following code.
(paths composeTraversal each composeLens getOps).getAll(docs)
Looks good so far but it results in a compiler error.
[error] ...: diverging implicit expansion for type cats.kernel.Order[A]
[error] starting with method catsKernelStdOrderForSortedSet in trait
        LowPrioritySortedSetInstancesBinCompat1
[error] (paths composeTraversal each composeLens getOps).getAll(docs)
[error] ^
This looks like an error from a binary incompatible cats version. But hey, this time the compiler is lying to us. Maybe you have guessed it already: we’re missing another type class instance. This time the one for Each which we need to traverse our ListMap structure. So let’s write one! :-)
implicit def listMapTraversal[K, V]: Traversal[ListMap[K, V], V] =
  new Traversal[ListMap[K, V], V] {
    def modifyF[F[_]: Applicative](f: V => F[V])(s: ListMap[K, V]): F[ListMap[K, V]] =
      s.foldLeft(Applicative[F].pure(ListMap.empty[K, V])) {
        case (acc, (k, v)) =>
          Applicative[F].map2(f(v), acc)((head, tail) => tail + (k -> head))
      }
  }

implicit def listMapEach[K, V]: Each[ListMap[K, V], V] =
  Each(listMapTraversal)
As noted above these things will hopefully be in the next Monocle release together with a much nicer API. :-D
The information we need to modify is within the components field of the generated documentation. While we could traverse all the other structures (paths, operations), these only hold references, which are of no use to us. So let’s update our Product model schema.
(components composeOptional possible composeLens componentsSchemas
  composeLens at("Product") composeOptional possible composeOptional
  possible composeLens schemaProperties composeLens at("id")
  composeOptional possible composeOptional possible composeLens
  schemaPattern)
  .set(Option("Our UUID regex here!"))(docs)
This is quite a lot, but we actually only instruct our optics how to traverse down the structure. Because the generated structure contains a lot of Option and Either fields, we need quite a bit of boilerplate here, but as mentioned it is really not that complicated. We compose our lenses via composeLens and need constructs like composeOptional possible to focus on a defined Option[T], which we sometimes have to duplicate for nested occurrences. The same construct can be used to zoom in on the right side of an Either.
(components composeOptional possible composeLens componentsSchemas
  composeLens at("Translation") composeOptional possible composeOptional
  possible composeLens schemaProperties composeLens at("lang")
  composeOptional possible composeOptional possible composeLens
  schemaPattern)
  .set(Option("Our language code regex here!"))(docs)
The modification for the Translation model looks pretty much the same, so we could extract a function out of it which takes some parameters and returns the updated structure, as sketched below.
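Here is a hedged sketch of such a helper. The name setComponentPattern is made up; it simply parameterises the optics chain we just used.

// Set the pattern of a single property of a named component schema.
def setComponentPattern(
    componentName: String,
    propertyName: String,
    pattern: String
)(docs: OpenAPI): OpenAPI =
  (components composeOptional possible composeLens componentsSchemas
    composeLens at(componentName) composeOptional possible composeOptional
    possible composeLens schemaProperties composeLens at(propertyName)
    composeOptional possible composeOptional possible composeLens
    schemaPattern)
    .set(Option(pattern))(docs)

// e.g. setComponentPattern("Translation", "lang", "Our language code regex here!")(docs)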
Oh yeah, right: we thought about extracting the regular expression directly from the refined type to avoid code duplication. So let’s try to use the function defined earlier like this: val langRegex = extractRegEx[LanguageCode].
[error] : type arguments [com.wegtam.books.pfhais.tapir.models.LanguageCode]
  do not conform to method extractRegEx's type parameter bounds [S <: String]
[error] val langRegex = extractRegEx[LanguageCode]
[error] ^
Okay, seems like I really don’t understand what I’m doing. ;-) Luckily for us the Scala community has nice, smart and helpful people in it. The solution is to write a type class which will support us extracting the desired parameter.
import eu.timepit.refined.api._
import eu.timepit.refined.string._
import shapeless.Witness

trait RefinedExtract[T] {
  def regex: String
}

object RefinedExtract {
  implicit def instance[T, S <: String](
      implicit ev: String Refined MatchesRegex[S] =:= T,
      ws: Witness.Aux[S]
  ): RefinedExtract[T] = new RefinedExtract[T] { val regex = ws.value }
}
This allows us to achieve the desired effect with just this small piece of code.
val typeRegex = implicitly[RefinedExtract[LanguageCode]].regex
// convert to Javascript regular expression
val langRegex = "/" + typeRegex + "/"
Now if we take a look at our API documentation it looks better, although we can see that the URL parameter pattern information is not used yet. But as mentioned before, there is a lot of work going on on the tapir side currently, so this will get fixed as well.
However, don’t forget about optics: their usefulness goes far beyond what we have done here.