The inference runtimes that MLServer offers out-of-the-box may not always be enough, or you may need extra custom functionality that MLServer does not include (e.g. custom codecs). To cover these cases, MLServer makes it easy to create your own custom runtimes.
To learn more about writing custom runtimes with MLServer, check out the Custom Runtimes user guide. Alternatively, see this end-to-end example, which walks through the process of writing a custom runtime.
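As a rough illustration of what a custom runtime looks like, the sketch below subclasses `mlserver.MLModel` and overrides its async `load()` and `predict()` methods. The `load_my_model` helper is hypothetical and stands in for whatever framework-specific loading code you need, and the use of MLServer's NumPy codecs here is one possible choice; refer to the user guide for the authoritative API details.

```python
from mlserver import MLModel
from mlserver.codecs import NumpyCodec, NumpyRequestCodec
from mlserver.types import InferenceRequest, InferenceResponse


class MyCustomRuntime(MLModel):
    async def load(self) -> bool:
        # Hypothetical helper: replace with your own model-loading logic,
        # e.g. reading the artifact pointed to by the model settings URI.
        self._model = load_my_model(self.settings.parameters.uri)
        self.ready = True
        return self.ready

    async def predict(self, payload: InferenceRequest) -> InferenceResponse:
        # Decode the first input tensor into a NumPy array, run inference,
        # and encode the result back into a V2 inference response.
        input_data = NumpyCodec.decode_input(payload.inputs[0])
        output = self._model.predict(input_data)
        return NumpyRequestCodec.encode_response(
            model_name=self.name, payload=output
        )
```

Once defined, the runtime is typically referenced from the model's `model-settings.json` via the `implementation` field (e.g. `"implementation": "models.MyCustomRuntime"`), so that MLServer knows which class to load for that model.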