Question: How can I split a Swagger definition across files? What are the possibilities in that area? The question details are described below:
I do have experience in RAML and what I do is, for example:
/settings:
  description: |
    This resource defines application & components configuration
  get:
    is: [ includingCustomHeaders ]
    description: |
      Fetch entire configuration
    responses:
      200:
        body:
          example: !include samples/settings.json
          schema: !include schemas/settings.json
The last two lines are important here - the ones with !include <filepath>
- in RAML I can split my entire contract into many files that just get included dynamically by the RAML parser (and the RAML parser is used by all tools that build on RAML).
My benefit from this is that each part of the contract lives in its own small, maintainable file.
As far as I read, Swagger supports the $ref keyword, which allows loading external files. But are those files fetched through HTTP/AJAX, or can they just be local files?
And is that supported by the whole specification, or do only some tools support it while others don't?
What I found is that the input for Swagger has to be one file, and that is extremely inconvenient for big projects.
Or, in other words, can I achieve the same with Swagger that I can with RAML, in terms of splitting files?
The specification allows for references in multiple locations, but not everywhere. These references are resolved depending on where the specification is being hosted and on what you're trying to do.
For something like rendering a dynamic user interface, yes, you do eventually need to load the entire definition into "a single object", which may be composed from many files. If you are performing code generation, the definitions may be loaded directly from the file system. But ultimately there are Swagger parsers doing the resolution, which is much more fine-grained and controllable in Swagger than in other definition formats.
In your case, you would use a JSON pointer to the schema reference:
responses:
  200:
    description: the response
    schema:
      # via local reference
      $ref: '#/definitions/myModel'

      # via absolute reference
      $ref: 'http://path/to/your/resource'

      # via relative reference, which would be relative to where this doc is loaded
      $ref: 'resource.json#/myModel'

      # via inline definition
      type: object
      properties:
        id:
          type: string
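For the relative reference to resolve, the external file just needs to contain a myModel node at the path given after the #. A minimal sketch of what resource.json could hold, shown here as YAML for readability (the properties are made up for illustration; a JSON file with the same structure resolves identically):

# contents of resource.json, sketched in YAML; any schema under the
# myModel key will be pulled in by the '#/myModel' pointer
myModel:
  type: object
  properties:
    id:
      type: string

A resolving parser looks for the file relative to the document that declares the $ref, then walks the #/myModel pointer inside it.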
When I split OpenAPI V3 files using references, I try to avoid the sock drawer anti-pattern and instead use functional groupings for the YAML files.
I also make it so that each YAML file itself is a valid OpenAPI V3 spec.
I start out with the openapi.yaml file.
openapi: 3.0.3
info:
  title: MyAPI
  description: |
    This is the public API for my stuff.
  version: "3"
tags:
  # NOTE: the name is needed as the info block uses `title` rather than `name`
  - name: Authentication
    $ref: 'authn.yaml#/info'
paths:
  # NOTE: here are the references to the other OpenAPI files,
  # keyed by path. Because OpenAPI requires paths to start with `/`
  # and that character is already used as a separator,
  # replace the `/` with `%2F` in the path reference.
  '/authn/start':
    $ref: 'authn.yaml#/paths/%2Fstart'
Then in the functional group:
openapi: 3.0.3
info:
  title: Authentication
  description: |
    This is the authentication module.
  version: "3"
paths:
  # NOTE: don't include the `/authn` prefix here; that top-level grouping
  # is in the `openapi.yaml` file.
  '/start':
    get:
      responses:
        "200":
          description: OK
By doing this separation you can independently test each file or the whole API as a group.
There may be points where you repeat yourself, but this limits the chance that a change to a "common" library breaks other API endpoints.
However, you should still have a common definition library for things that are genuinely shared across modules.
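As a rough sketch of that idea (the file name common.yaml and the Error schema are hypothetical, not part of the approach described above), the shared library can itself be a standalone OpenAPI document whose schemas the functional groups reference:

# common.yaml - hypothetical shared definition library
openapi: 3.0.3
info:
  title: Common definitions
  version: "3"
paths: {}
components:
  schemas:
    Error:
      type: object
      properties:
        code:
          type: string
        message:
          type: string

A response in authn.yaml (or any other functional group) could then point at it with $ref: 'common.yaml#/components/schemas/Error'.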
There is a limitation to this approach, and that's discriminators (it may be a ReDoc issue, though): if you have types with discriminators outside of the openapi.yaml file, ReDoc fails to render them correctly.