# Why ServiceStack

Source: https://docs.servicestack.net/why-servicestack

Developed in the modern age, ServiceStack provides an alternate, cleaner POCO-driven way of creating web services.

### Features Overview

ServiceStack is a simple, fast, versatile and highly-productive full-featured [Web](https://razor.netcore.io) and [Web Services](/web-services) Framework that's thoughtfully-architected to [reduce artificial complexity](/autoquery/why-not-odata#why-not-complexity) and promote [remote services best-practices](/advantages-of-message-based-web-services) with a [message-based design](/what-is-a-message-based-web-service) that allows for maximum re-use and can leverage an integrated [Service Gateway](/service-gateway) for the creation of loosely-coupled [Modularized Service](/modularizing-services) Architectures.

ServiceStack Services are consumable via an array of built-in fast data formats (inc. [JSON](https://github.com/ServiceStack/ServiceStack/tree/main/ServiceStack.Text/src/ServiceStack.Text), XML, [CSV](/csv-format), [JSONL](/jsonl-format), [JSV](/jsv-format), [ProtoBuf](/protobuf-format) and [MsgPack](/messagepack-format)) as well as XSD/WSDL for [SOAP endpoints](/soap-support) and [Rabbit MQ](/rabbit-mq), [Redis MQ](/redis-mq), [Azure Service Bus](/azure-service-bus-mq), [Amazon SQS](/aws#sqsmqserver) and [Background MQ](/background-mq) MQ hosts.

Its design and simplicity focus offers an unparalleled suite of productivity features that can be declaratively enabled without code: from creating fully queryable Web APIs with just a single Typed Request DTO with [Auto Query](/autoquery/) supporting [every major RDBMS](/ormlite/#ormlite-rdbms-providers), to the built-in support for [Auto Batched Requests](/auto-batched-requests), or effortlessly enabling rich [HTTP Caching](/http-caching) and [Encrypted Messaging](/auth/encrypted-messaging) for all your existing services via [Plugins](/plugins).

Your same Services also serve as the Controller in ServiceStack's [Smart Razor Views](https://razor.netcore.io/), reducing the effort to serve both [Web and Single Page Apps](https://github.com/ServiceStackApps/LiveDemos) as well as [Rich Desktop and Mobile Clients](https://github.com/ServiceStackApps/HelloMobile) that are able to deliver instant interactive experiences using ServiceStack's real-time [Server Events](/server-events).

ServiceStack Services also maximize productivity for consumers by providing an [instant end-to-end typed API without code-gen](/csharp-client), enabling the most productive development experience for developing .NET to .NET Web Services.
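For example, the integrated Service Gateway lets one Service call another through the same typed message contract whether it's hosted in-process or on a remote host. A minimal sketch, assuming hypothetical `GetCustomerOrders`/`GetCustomer` DTOs and an `Order` table (not part of the examples below):

```csharp
public class CustomerOrdersService : Service
{
    public async Task<object> Any(GetCustomerOrders request)
    {
        // The Gateway resolves to an in-process or remote Service Client,
        // keeping this Service loosely-coupled from where Customers are hosted
        var customer = await Gateway.SendAsync(new GetCustomer { Id = request.CustomerId });
        var orders = await Db.SelectAsync<Order>(x => x.CustomerId == request.CustomerId);
        return new GetCustomerOrdersResponse { Customer = customer, Orders = orders };
    }
}
```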
### Benefits

- **Simplicity** - All features are centered around APIs that accept and return Typed DTOs
- **Speed** - Built for speed on high-performance components utilizing performance APIs available in each .NET runtime
- **Web Services Best Practices** - Adopts time-tested SOA Integration Patterns for APIs and client integrations
- **Message-based Services** - Model-driven, code-first, friction-free development
- **Native Clients** - Clean, end-to-end typed idiomatic APIs for most major platforms
- **Modern** - No XML config, IOC built-in, no code-gen, conventional defaults
- **Smart** - Infers greater intelligence from your strongly typed DTOs
- **Effortless Features** - Most features enhance your existing DTOs making them trivial to enable
- **Multi Platform** - Supports .NET 4.5 and .NET Core platforms for hosting on Windows, OSX, Linux
- **Multiple Hosts** - Run in Web, Console, native Windows/OSX Desktop Apps, Windows Services
- **Host Agnostic** - Services are decoupled from HTTP and can be hosted in MQ Services
- **Highly testable** - Typed, idiomatic client APIs enable succinct, intuitive Integration tests
- **Mature** - Stable with over 10 years of development
- **Preserve Investment** - modern libraries that are [Continuously Improved](/release-notes-history) (not abandoned or replaced)
- **Dependable** - Commercially supported and actively developed
- **Increasing Value** - ServiceStack's [ever-growing features](https://servicestack.net/features) add more capabilities around your Services with each release

### Generate Instant Typed APIs from within all Major IDEs!

ServiceStack now [integrates with all Major IDEs](/add-servicestack-reference.html) used for creating the best native experiences on the most popular platforms to enable a highly productive dev workflow for consuming Web Services, making ServiceStack the ideal back-end choice for powering rich, native iPhone and iPad Apps on iOS with Swift, Mobile and Tablet Apps on the Android platform with Java, OSX Desktop Applications as well as targeting the most popular .NET PCL platforms including Xamarin.iOS, Xamarin.Android, Windows Store, WPF, WinForms and Silverlight:

[![](./img/pages/servicestack-reference/ide-plugins-splash.png)](https://www.youtube.com/watch?v=JKsgrstNnYY)

#### [JetBrains Rider ServiceStack Plugin](https://www.youtube.com/watch?v=JKsgrstNnYY)

The **ServiceStack** Rider plugin is installable directly from the JetBrains Marketplace and enables seamless integration with JetBrains Rider for easily generating C#, TypeScript, F# and VB.NET Typed APIs from just a remote ServiceStack Base URL.

#### [VS.NET integration with ServiceStackVS](/create-your-first-webservice#step-1-download-and-install-servicestackvs)

Providing instant Native Typed APIs for [C#](/csharp-add-servicestack-reference.html), [TypeScript](/typescript-add-servicestack-reference.html), [F#](/fsharp-add-servicestack-reference.html) and [VB.NET](/vbnet-add-servicestack-reference.html) directly in Visual Studio for the [most popular .NET platforms](https://github.com/ServiceStackApps/HelloMobile) including iOS and Android using [Xamarin.iOS](https://github.com/ServiceStackApps/HelloMobile#xamarinios-client) and [Xamarin.Android](https://github.com/ServiceStackApps/HelloMobile#xamarinandroid-client) on Windows.
#### [Xamarin Studio integration with ServiceStackXS](/csharp-add-servicestack-reference.html#xamarin-studio)

Providing [C# Native Types](/csharp-add-servicestack-reference.html) support for developing iOS and Android mobile Apps using [Xamarin.iOS](https://github.com/ServiceStackApps/HelloMobile#xamarinios-client) and [Xamarin.Android](https://github.com/ServiceStackApps/HelloMobile#xamarinandroid-client) with [Xamarin Studio](https://www.xamarin.com/studio) on OSX. The **ServiceStackXS** plugin also provides a rich web service development experience for developing Client applications with [MonoDevelop on Linux](/csharp-add-servicestack-reference.html#xamarin-studio-for-linux).

#### [Android Studio integration with ServiceStack Plugin](/java-add-servicestack-reference.html)

Providing [an instant Native Typed API in Java](/java-add-servicestack-reference.html) and [Kotlin](/kotlin-add-servicestack-reference.html) including idiomatic Java Generic Service Clients supporting Sync and Async Requests by leveraging Android's AsyncTasks to enable the creation of services-rich and responsive native Java or Kotlin Mobile Apps on the Android platform - directly from within Android Studio!

#### [JetBrains IDEs integration with ServiceStack IDEA plugin](/java-add-servicestack-reference.html#install-servicestack-idea-from-the-plugin-repository)

The ServiceStack IDEA plugin is installable directly from IntelliJ's Plugin repository and enables seamless integration with IntelliJ Java Maven projects for generating a Typed API to quickly and effortlessly consume remote ServiceStack Web Services from pure cross-platform Java or Kotlin Clients.

#### [Eclipse integration with ServiceStackEclipse](https://github.com/ServiceStack/ServiceStack.Java/tree/master/src/ServiceStackEclipse#eclipse-integration-with-servicestack)

The unmatched productivity offered by [Java Add ServiceStack Reference](/java-add-servicestack-reference.html) is also available in the [ServiceStackEclipse IDE Plugin](https://github.com/ServiceStack/ServiceStack.Java/tree/master/src/ServiceStackEclipse#eclipse-integration-with-servicestack) that's installable from the [Eclipse MarketPlace](https://marketplace.eclipse.org/content/servicestackeclipse) to provide deep integration of Add ServiceStack Reference with Eclipse Java Maven Projects, enabling Java Developers to effortlessly Add and Update the references of their evolving remote ServiceStack Web Services.

#### [Simple command-line utilities for ServiceStack](/add-servicestack-reference.html#simple-command-line-utilities)

In addition to our growing list of supported IDEs, the [x dotnet tool](/dotnet-tool) allows VS Code and other cross-platform IDEs, build servers, shell scripts and other automated tasks to easily Add and Update ServiceStack References with a single command.

#### [Invoke ServiceStack APIs from the command-line](/post-command)

Easily inspect and invoke C# .NET Web APIs from the command-line with Post Command, which lets you call any ServiceStack API with just its name and a JS Object literal. API Responses are returned in human-friendly markdown tables by default or optionally as JSON & raw HTTP.
## Simple Customer Database REST Services Example

This example is also available as a [stand-alone integration test](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.WebHost.Endpoints.Tests/CustomerRestExample.cs):

```csharp
//Web Service Host Configuration
public class AppHost : AppSelfHostBase
{
    public AppHost() 
        : base("Customer REST Example", typeof(CustomerService).Assembly) {}

    public override void Configure(Container container)
    {
        //Register which RDBMS provider to use
        container.Register<IDbConnectionFactory>(c => 
            new OrmLiteConnectionFactory(":memory:", SqliteDialect.Provider));

        using (var db = container.Resolve<IDbConnectionFactory>().Open())
        {
            //Create the Customer POCO table if it doesn't already exist
            db.CreateTableIfNotExists<Customer>();
        }
    }
}

//Web Service DTOs
[Route("/customers", "GET")]
public class GetCustomers : IReturn<GetCustomersResponse> {}

public class GetCustomersResponse
{
    public List<Customer> Results { get; set; } 
}

[Route("/customers/{Id}", "GET")]
public class GetCustomer : IReturn<Customer>
{
    public int Id { get; set; }
}

[Route("/customers", "POST")]
public class CreateCustomer : IReturn<Customer>
{
    public string Name { get; set; }
}

[Route("/customers/{Id}", "PUT")]
public class UpdateCustomer : IReturn<Customer>
{
    public int Id { get; set; }
    public string Name { get; set; }
}

[Route("/customers/{Id}", "DELETE")]
public class DeleteCustomer : IReturnVoid
{
    public int Id { get; set; }
}

// POCO DB Model
public class Customer
{
    [AutoIncrement]
    public int Id { get; set; }
    public string Name { get; set; }
}

//Web Services Implementation
public class CustomerService : Service
{
    public object Get(GetCustomers request)
    {
        return new GetCustomersResponse { Results = Db.Select<Customer>() };
    }

    public object Get(GetCustomer request)
    {
        return Db.SingleById<Customer>(request.Id);
    }

    public object Post(CreateCustomer request)
    {
        var customer = new Customer { Name = request.Name };
        Db.Save(customer);
        return customer;
    }

    public object Put(UpdateCustomer request)
    {
        var customer = Db.SingleById<Customer>(request.Id);
        if (customer == null)
            throw HttpError.NotFound($"Customer '{request.Id}' does not exist");

        customer.Name = request.Name;
        Db.Update(customer);
        return customer;
    }

    public void Delete(DeleteCustomer request)
    {
        Db.DeleteById<Customer>(request.Id);
    }
}
```

### [Calling the above REST Service from any C#/.NET Client](/csharp-add-servicestack-reference.html)

No code-gen required, can re-use above Server DTOs:

```csharp
var client = new JsonApiClient(BaseUri);

//GET /customers
var all = client.Get(new GetCustomers());                         // Count = 0

//POST /customers
var customer = client.Post(new CreateCustomer { Name = "Foo" });

//GET /customers/1
customer = client.Get(new GetCustomer { Id = customer.Id });      // Name = Foo

//GET /customers
all = client.Get(new GetCustomers());                             // Count = 1

//PUT /customers/1
customer = client.Put(
    new UpdateCustomer { Id = customer.Id, Name = "Bar" });       // Name = Bar

//DELETE /customers/1
client.Delete(new DeleteCustomer { Id = customer.Id });

//GET /customers
all = client.Get(new GetCustomers());                             // Count = 0
```

Same code also works with [Android, iOS, Xamarin.Forms, UWP and WPF clients](https://github.com/ServiceStackApps/HelloMobile).
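The same typed calls are also available as async APIs on the generic .NET Service Clients. A minimal sketch re-using the DTOs above (shown as statements assumed to run inside an async method):

```csharp
var client = new JsonApiClient(BaseUri);

// Async versions of the same typed API calls
var all = await client.GetAsync(new GetCustomers());
var customer = await client.PostAsync(new CreateCustomer { Name = "Foo" });
customer = await client.PutAsync(new UpdateCustomer { Id = customer.Id, Name = "Bar" });
await client.DeleteAsync(new DeleteCustomer { Id = customer.Id });
```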
::: info
[F#](/fsharp-add-servicestack-reference.html) and [VB.NET](/vbnet-add-servicestack-reference.html) can re-use the same [.NET Service Clients](/csharp-client.html) and DTOs
:::

### [Calling from TypeScript](/typescript-add-servicestack-reference.html#ideal-typed-message-based-api)

```ts
const client = new JsonServiceClient(baseUrl);
const { results } = await client.get(new GetCustomers());
```

### [Calling from Swift](/swift-add-servicestack-reference.html#jsonserviceclientswift)

```swift
let client = JsonServiceClient(baseUrl: BaseUri)
client.getAsync(GetCustomers())
    .then {
        let results = $0.results
    }
```

### [Calling from Java](/java-add-servicestack-reference.html#jsonserviceclient-usage)

```java
JsonServiceClient client = new JsonServiceClient(BaseUri);
GetCustomersResponse response = client.get(new GetCustomers());
List<Customer> results = response.results;
```

### [Calling from Kotlin](/kotlin-add-servicestack-reference.html#jsonserviceclient-usage)

```kotlin
val client = JsonServiceClient(BaseUri)
val response = client.get(GetCustomers())
val results = response.results
```

### Calling from [Dart](/dart-add-servicestack-reference#example-usage)

```dart
var client = new JsonServiceClient(baseUri);
var response = await client.get(new GetCustomers());
```

### [Calling from jQuery using TypeScript Definitions](/typescript-add-servicestack-reference.html#typescript-interface-definitions)

```js
$.getJSON($.ss.createUrl("/customers", request), request,
    function (r: dtos.GetCustomersResponse) {
        alert(r.Results.length == 1);
    });
```

### Calling from jQuery

```js
$.getJSON(baseUri + "/customers", function(r) {
    alert(r.Results.length == 1);
});
```

That's all the application code required to create and consume a simple database-enabled REST Web Service!

### Define web services following Martin Fowler's Data Transfer Object Pattern

ServiceStack was heavily influenced by [**Martin Fowler's Data Transfer Object Pattern**](http://martinfowler.com/eaaCatalog/dataTransferObject):

>When you're working with a remote interface, such as Remote Facade (388), each call to it is expensive.
>As a result you need to reduce the number of calls, and that means that you need to transfer more data
>with each call. One way to do this is to use lots of parameters.
>However, this is often awkward to program - indeed, it's often impossible with languages such as Java
>that return only a single value.
>
>The solution is to create a Data Transfer Object that can hold all the data for the call. It needs to be serializable to go across the connection.
>Usually an assembler is used on the server side to transfer data between the DTO and any domain objects.

The Request and Response DTOs used to define web services in ServiceStack are standard POCOs, while the implementation just needs to inherit from a testable and dependency-free `IService` marker interface. As a bonus for keeping your DTOs in a separate dependency-free .dll, you're able to re-use them in your C#/.NET clients, providing a strongly-typed API without any code-gen whatsoever. Also, your DTOs *define everything*: ServiceStack does not pollute your web services with any additional custom artifacts or markup.
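As a rough sketch of what this separation looks like in practice (the `Hello` DTOs below are illustrative and not part of the Customer example above), the DTOs live in a dependency-free ServiceModel project while the implementation only needs the `IService` marker interface:

```csharp
// ServiceModel project: dependency-free DTOs shared with all clients
[Route("/hello/{Name}")]
public class Hello : IReturn<HelloResponse>
{
    public string Name { get; set; }
}

public class HelloResponse
{
    public string Result { get; set; }
}

// ServiceInterface project: IService is just a marker interface
public class HelloService : IService
{
    public object Any(Hello request) =>
        new HelloResponse { Result = $"Hello, {request.Name}!" };
}
```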
### Multiple Clients

Our generic Service Clients cover the most popular Mobile, Desktop and Server platforms with first-class implementations for Xamarin, Android, Java and TypeScript which now includes:

- [.NET Service Clients](/csharp-client) - C# / VB.NET / F#
  - .NET Core 2.1+
  - .NET Framework 4.5+
  - Blazor WASM
  - Xamarin.iOS
  - Xamarin.Android
  - UWP
  - Silverlight
- [TypeScript Service Client](/typescript-add-servicestack-reference#typescript-serviceclient)
  - Web
  - Node.js Server
  - React Native
    - iOS
    - Android
- [Python Service Client](/python-add-servicestack-reference)
- [Dart](/dart-add-servicestack-reference)
  - Flutter
    - iOS
    - Android
  - Web / Angular.dart
- [Java Service Client](/java-add-servicestack-reference#jsonserviceclient-api)
  - Android
  - JVM 1.7+ (Java, Kotlin, Scala, etc)
    - Java Clients
    - Java Servers
- [Kotlin Service Client](/kotlin-add-servicestack-reference)
- [Swift Service Client](/swift-add-servicestack-reference#swift-client-usage)
  - iOS
  - OSX
  - [Swift Package Manager Apps](https://github.com/ServiceStackApps/swift-techstacks-console)
- [JavaScript (jQuery)](/ss-utils-js)
  - Web
- [MQ Clients](/messaging#mq-client-architecture)
  - Background MQ
  - Rabbit MQ
  - Redis MQ
  - Amazon SQS
  - Azure Service Bus

### Multiple pluggable Formats

ServiceStack re-uses the custom artifacts above and, with zero-config and without imposing any extra burden on the developer, adds discoverability and provides hosting of your web service on a number of different formats, including:

- [JSON](/json-format)
- XML
- [JSV](/jsv-format)
- [CSV](/csv-format)
- [MsgPack](/messagepack-format)
- [ProtoBuf](/protobuf-format)
- [gRPC](/grpc/)
- [SOAP 1.1/1.2](/soap-support)
- HTML
  - [HTML5 Report Format](/html5reportformat)
  - [Sharp Pages](https://sharpscript.net/docs/script-pages)
  - [Razor](https://razor.netcore.io/)
  - [Markdown Razor](/markdown-razor)

### Multiple Endpoints

Whilst ServiceStack is fundamentally a premier HTTP Framework, its Services can also be consumed from the new [gRPC](/grpc/) as well as legacy [SOAP 1.1 and 1.2](/soap-support) endpoints, in addition to a number of [MQ Servers](/messaging):

- [Background MQ Service](/background-mq)
- [Rabbit MQ Server](/rabbit-mq)
- [Redis MQ Server](/redis-mq)
- [Amazon SQS MQ Server](/amazon-sqs-mq)
- [Azure Service Bus MQ](/azure-service-bus-mq)

### Multiple Hosting Options

In addition to supporting multiple formats and endpoints, ServiceStack can also be hosted within a multitude of different hosting options:

#### Windows, OSX or Linux

- **.NET Core 2.1+**
  - [Web App or SelfHost](https://github.com/NetCoreApps/LiveDemos#servicestack-net-core-live-demos)
  - [Worker Service](/messaging#worker-service-templates)

#### Windows

- **.NET Framework 4.5+**
  - [ASP.NET Core 2.1 LTS](/templates/corefx)
  - [Classic ASP.NET System.Web](https://github.com/ServiceStackApps/LiveDemos#live-servicestack-demos)
  - [Stand-alone, Self-Hosted HttpListener](/self-hosting)
  - [Stand-alone Windows Service](/templates/windows-service)
  - [Hosted inside WinForms with Chromium Embedded Framework](https://github.com/ServiceStack/ServiceStack.Gap#winforms-with-chromium-embedded-framework)
  - [Windows and Azure Service Fabric](https://github.com/ServiceStackApps/HelloServiceFabric)

#### OSX

- [Hosted inside Mac OSX Cocoa App with Xamarin.Mac](https://github.com/ServiceStack/ServiceStack.Gap#mac-osx-cocoa-app-with-xmarainmac)

### Target Multiple platforms

With multi-targeted projects creating both .NET Framework and .NET Standard builds you can optionally run your same ServiceStack App on
multiple platforms as seen with the [Hello Mobile Shared Gateway](/releases/v5_0_0#run-aspnet-core-apps-on-the-net-framework) project where its same shared [ServiceStack Server.Common project](https://github.com/ServiceStackApps/HelloMobile#servicestack-server-app) is used to host the same App running on:

- [Server.NetCore](https://github.com/ServiceStackApps/HelloMobile/tree/master/src/Server.NetCore) - hosting the ServiceStack Services in an **ASP.NET Core 2.1 App**
- [Server.NetCoreFx](https://github.com/ServiceStackApps/HelloMobile/tree/master/src/Server.NetCoreFx) - hosting in an **ASP.NET Core App** on the **.NET Framework**
- [Server.AspNet](https://github.com/ServiceStackApps/HelloMobile/tree/master/src/Server.AspNet) - hosting classic **ASP.NET Framework** Web Applications
- [Server.HttpListener](https://github.com/ServiceStackApps/HelloMobile/tree/master/src/Server.HttpListener) - hosting in a .NET Framework Self-Hosting **HttpListener** AppHost

### VS.NET Templates

There's a [VS.NET Template](/templates/) for creating solutions targeting most of the above platforms. E.g. the [React Desktop Apps](https://github.com/ServiceStackApps/ReactDesktopApps) VS.NET Template provides an easy and integrated way to host a Single Page React App on multiple platforms.

## Goals of Service Design

The primary benefits of Services are that they offer the highest level of software re-use; they're [Real Computers all the way down](https://mythz.servicestack.net/#messaging) retaining the ability to represent anything. Especially at this level, encapsulation and its external interactions are paramount, which sees the [Service Layer as its most important Contract](http://stackoverflow.com/a/15369736/85785), constantly evolving to support new capabilities whilst serving and outliving its many consumers.

Extra special attention should be given to Service design with the primary goals of exposing its capabilities behind [consistent and self-describing](/why-servicestack#goals-of-service-design), intent-based [tell-dont-ask](https://pragprog.com/articles/tell-dont-ask) APIs.

A Service's ability to encapsulate complexity is what empowers consumers to perform higher-level tasks like provisioning a cluster of AWS servers or sending a tweet to millions of followers in seconds with just a simple HTTP request, i.e. being able to re-use existing hardened functionality without the required effort, resources and infrastructure to facilitate the request yourself.

To maximize accessibility it's recommended for Service Interfaces to be orientated around resources and verbs, retain a flat structure, and be customizable with key value pairs so they're accessible via the built-in QueryString and FormData support present in all HTTP clients, from HTML Forms to command-line utilities like [curl](https://curl.haxx.se).

### WCF the anti-DTO Web Services Framework

Unfortunately this best-practices convention is effectively discouraged by Microsoft's WCF SOAP Web Services framework as it encourages you to develop API-specific RPC method calls by mandating the use of method signatures to define your web services API. This results in less re-usable, more client-specific APIs that encourage more remote method calls.

Unhappy with this perceived anti-pattern in WCF, ServiceStack was born, providing a Web Service framework that embraces best-practices for calling remote services, using config-free, convention-based DTOs.
### Encourages development of message-style, re-usable and batch-full web services

Entire POCO types are used to define the Request and Response DTOs to promote the creation of well-defined, coarse-grained web services. Message-based interfaces are best-practices when dealing with out-of-process calls as they can batch more work using fewer network calls and are ultimately more re-usable as the same operation can be called using different calling semantics. This is in stark contrast to WCF's Operation or Service contracts which encourage RPC-style, application-specific web services by using method signatures to define each operation.

As it stands in general-purpose computing today, there is nothing more expensive you can do than a remote network call. Although easier for the newbie developer, by using _methods_ to define web service operations, WCF is promoting bad-practices by encouraging developers to design and treat web-service calls like normal function calls even though they are millions of times slower. Especially at the app-server tier, nothing hurts the performance and scalability of your client and server more than multiple dependent and synchronous web service calls.

Batch-full, message-based web services are ideally suited to the development of SOA services as they result in fewer, richer and more re-usable web services that need to be maintained. RPC-style services normally manifest themselves from a *client perspective*, that is, as the result of the requirements of a single application's data access scenario. Single applications come and go over time while your data and services are poised to hang around for the longer term. Ideally you want to think about the definition of your web service from a *services and data perspective* and how you can expose your data so it is more re-usable by a number of your clients.

## Difference between an RPC-chatty and message-based API

```csharp
public interface IWcfCustomerService
{
    Customer GetCustomerById(int id);
    List<Customer> GetCustomerByIds(int[] id);

    Customer GetCustomerByUserName(string userName);
    List<Customer> GetCustomerByUserNames(string[] userNames);

    Customer GetCustomerByEmail(string email);
    List<Customer> GetCustomerByEmails(string[] emails);
}
```

### contrast with an equivalent message based service:

```csharp
public class Customers : IReturn<List<Customer>>
{
    public int[] Ids { get; set; }
    public string[] UserNames { get; set; }
    public string[] Emails { get; set; }
}
```

**Any combination of the above can be fulfilled by 1 remote call, by the same single web service - i.e. what ServiceStack encourages!**

Fewer and more batch-full services require less maintenance and promote the development of more re-usable and efficient services. In addition, message APIs are much more resilient to changes as you're able to safely add more functionality or return more data without breaking or needing to re-gen existing clients. Message-based APIs also lend themselves better to cached, asynchronous, deferred, proxied and reliable execution with the use of brokers and proxies.

Comparatively there is almost no win for a remote RPC API, except to maybe [hide that a remote service even exists](https://en.wikipedia.org/wiki/Fallacies_of_Distributed_Computing) by making a remote call look like a method call even though they're millions of times slower, leading new developers to develop inefficient, brittle systems from the start. A sketch of how the single message-based `Customers` Service above might be implemented is shown below.
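As a rough illustration (not from the original example), a single ServiceStack Service could fulfill any combination of the `Customers` request above by applying each supplied filter, assuming a `Customer` table with `Id`, `UserName` and `Email` columns managed by OrmLite:

```csharp
public class CustomersService : Service
{
    // A single implementation handles every combination of Ids, UserNames and Emails
    public object Any(Customers request)
    {
        var q = Db.From<Customer>();

        // Apply only the filters that were supplied; no filters returns all Customers
        if (request.Ids?.Length > 0)
            q.Or(x => Sql.In(x.Id, request.Ids));
        if (request.UserNames?.Length > 0)
            q.Or(x => Sql.In(x.UserName, request.UserNames));
        if (request.Emails?.Length > 0)
            q.Or(x => Sql.In(x.Email, request.Emails));

        return Db.Select(q);
    }
}
```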
# Architecture Overview

Source: https://docs.servicestack.net/architecture-overview

Ultimately behind-the-scenes ServiceStack is just built on top of ASP.NET's Raw [IHttpAsyncHandler](https://msdn.microsoft.com/en-us/library/ms227433.aspx). Existing abstractions and [xmlconfig-encumbered legacy ASP.NET providers](http://mono.servicestack.net/mvc-powerpack/) have been abandoned in favour of fresh, simple and clean [Caching](/caching), [Session](/auth/sessions) and [Authentication](/auth/authentication-and-authorization) providers all based on clean POCOs, supporting multiple back-ends and all working seamlessly together.

Our best-practices architecture is purposely kept simple and introduces minimal new concepts or artificial constructs that can all be eloquently captured in the diagram below:

## Server Architecture

![ServiceStack Logical Architecture View](/img/pages/overview/servicestack-logical-view-02.png)

## Client Architecture

ServiceStack's [Message-based design](/advantages-of-message-based-web-services) allows us to easily support [typed, generic and re-usable Service Clients](/clients-overview) for all our popular formats:

![ServiceStack HTTP Client Architecture](/img/pages/overview/servicestack-httpclients.png)

Having all clients share the same interface allows them to be hot-swappable at run-time without code changes and keeps them highly testable, where the same unit test can also [serve as an XML, JSON, JSV, SOAP Integration Test](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.WebHost.IntegrationTests/Tests/WebServicesTests.cs).

By promoting clean (endpoint-ignorant and dependency-free) Service and DTO classes, your web services are instantly re-usable and can be hosted in non-HTTP contexts as well. E.g. the client architecture when one of the [built-in MQ Hosts is enabled](/redis-mq):

![ServiceStack MQ Client Architecture](/img/pages/overview/servicestack-mqclients.png)

## Implementation

The entry point for all ASP.NET and HttpListener requests is in the [ServiceStack.HttpHandlerFactory](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/HttpHandlerFactory.cs), whose purpose is to return the appropriate IHttpHandler for the incoming request.

There are 2 distinct modes in any ServiceStack application:

1. AppHost Setup and Configuration - Only done once for all services. Run only once on App StartUp.
1. Runtime - Run on every request: uses dependencies, plugins, etc. defined in the AppHost. Each new request re-binds all IOC dependencies to a new service instance which gets disposed at the end of each request.

The implementation of this can be visualized below:

![ServiceStack Overview](/img/pages/overview/servicestack-overview-01.png)

After the `IHttpHandler` is returned, it gets executed with the current ASP.NET or HttpListener request wrapped in a common [IRequest](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Web/IRequest.cs) instance.

# Instantly Servicify existing Systems

Source: https://docs.servicestack.net/servicify

In addition to [AutoQuery](/autoquery/rdbms) automatically providing your Services implementations and [Studio](/studio) providing its instant UI, ServiceStack also gained the capability to **[generate your entire API](/autoquery/autogen)** including Typed API contracts, data models, implementations & human-friendly pluralized HTTP API routes over an existing System RDBMS's tables.
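As an indicative sketch of what enabling this looks like (using the documented `AutoQueryFeature` and `GenerateCrudServices` registration against an already-registered RDBMS connection; the hosting-startup wrapper and `MaxLimit` value are illustrative):

```csharp
public class ConfigureAutoQuery : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            services.AddPlugin(new AutoQueryFeature {
                MaxLimit = 1000,
                // Generate AutoQuery CRUD APIs + Typed DTOs for every table
                // in the registered RDBMS at StartUp
                GenerateCrudServices = new GenerateCrudServices {
                    AutoRegister = true
                }
            });
        });
}
```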
## AutoGen

ServiceStack's [AutoGen](/autoquery/autogen) enables a number of exciting possibilities; predominantly it's the fastest way to ServiceStack-ify an existing system's RDBMS, where it will serve as an invaluable tool for anyone wanting to quickly migrate to ServiceStack and access the functionality ecosystem around ServiceStack Services.

**[AutoGen's](/autoquery/autogen)** code generation is programmatically customizable where the generated types can be easily augmented with additional declarative attributes to inject your App's conventions into the auto generated Services & Types to apply custom behavior like Authorization & additional validation rules.

After codifying your system conventions the generated classes can optionally be "ejected" where code-first development can continue as normal.

This feature enables rewriting parts of, or modernizing, legacy systems with the least amount of time & effort. Once Servicified you can take advantage of declarative features like Multitenancy, Optimistic Concurrency & Validation, enable automatic features like Executable Audit History, allow business users to maintain validation rules in its RDBMS, manage them through **Studio** & have them applied instantly at runtime and visibly surfaced through ServiceStack's myriad of [client UI auto-binding options](/world-validation).

**Studio** can then enable stakeholders with an instant UI to quickly access and search through their data, import custom queries directly into Excel or access them in other registered Content Types through a custom UI where fine-grained app-level access can be applied to customize which tables & operations different users have.

### gRPC's Typed protoc Universe

**AutoGen** also enables access to ServiceStack's ecosystem of metadata services & connectivity options where it's now become the **fastest way to generate gRPC endpoints** over an existing system. This is especially exciting as in addition to enabling high-performance connectivity to your System's data, it opens it up to [all languages in gRPC's protoc universe](https://grpc.io/docs/languages/).

Whilst the Smart, Generic [C# / F# / VB.NET Service Clients](/grpc/generic) continue to provide the best UX for consuming gRPC Services, one of the nicest **protoc generated** client languages is [Dart](http://dart.dev) - a modern high-level language with native class performance & script-like productivity where individual source files can be run immediately without compilation. Its quality tooling, static analysis & high-level features like async/await make it an ideal exploratory language for consuming gRPC endpoints.

### Dart gRPC Script Playground

This quick demo shows an example of instantly Servicifying a database & accessing it via gRPC in minutes, starting with a new [grpc](https://github.com/NetCoreTemplates/grpc) project from scratch. It [mixes](/mix-tool) in [autocrudgen](https://gist.github.com/gistlyn/464a80c15cb3af4f41db7810082dc00c) to configure **AutoGen** to generate AutoQuery services for the registered [sqlite](https://gist.github.com/gistlyn/768d7b330b8c977f43310b954ceea668) RDBMS that's copied into the project from the [northwind.sqlite](https://gist.github.com/gistlyn/97d0bcd3ebd582e06c85f8400683e037) gist.
Once the servicified App is running it accesses the gRPC Services in a new Dart Console App using the UX-friendly [Dart gRPC support in the x dotnet tool](/grpc/dart) to call the protoc generated Services:

> YouTube: [youtu.be/5NNCaWMviXU](https://youtu.be/5NNCaWMviXU)

[![](/img/pages/release-notes/v5.9/autogen-grpc.png)](https://youtu.be/5NNCaWMviXU)

### Flutter gRPC Android App

And if you can access it from Dart, you can access it from all platforms Dart runs on - the most exciting is Google's [Flutter](https://flutter.dev) UI Kit for building beautiful, natively compiled applications for Mobile, Web, and Desktop from a single codebase:

> YouTube: [youtu.be/3iz9aM1AlGA](https://youtu.be/3iz9aM1AlGA)

[![](/img/pages/release-notes/v5.9/autogen-grpc-flutter.jpg)](https://youtu.be/3iz9aM1AlGA)

## React Native Typed Client

gRPC is just [one of the endpoints ServiceStack Services](/why-servicestack#multiple-clients) can be accessed from. For an even richer & more integrated development UX they're also available in all popular Mobile, Web & Desktop languages [Add ServiceStack Reference](/add-servicestack-reference) supports, like [TypeScript](/typescript-add-servicestack-reference) which can be used in Browser & Node TypeScript code-bases as well as JavaScript-only code-bases like [React Native](https://reactnative.dev) - a highly productive Reactive UI for developing iOS and Android Apps:

[![](/img/pages/release-notes/v5.9/autogen-react-native.png)](https://youtu.be/6-SiLAbY63w)

::: info YouTube
[youtu.be/6-SiLAbY63w](https://youtu.be/6-SiLAbY63w)
:::

# Explore ServiceStack

Source: https://docs.servicestack.net/explore-servicestack

If you're completely new to ServiceStack, the [YouTube channel](https://www.youtube.com/channel/UC0kXKGVU4NHcwNdDdRiAJSA/videos) is a great way to explore some of the possibilities easily enabled with ServiceStack:

[![](/img/pages/overview/servicestack-youtube.png)](https://www.youtube.com/channel/UC0kXKGVU4NHcwNdDdRiAJSA/videos)

## Explore ServiceStack Apps

A great way to learn new technology is to explore existing Apps built with it, where for ServiceStack you can find a number of simple focused Apps at:

### [.NET Core Apps](https://github.com/NetCoreApps/LiveDemos)

### [.NET Framework Apps](https://github.com/ServiceStackApps/LiveDemos#live-servicestack-demos)

### [Sharp Apps](https://sharpscript.net/sharp-apps/app-index)

Many Apps are well documented, like [World Validation](/world-validation) which covers how to re-implement a simple Contacts App UI in **10 popular Web Development approaches** - all calling the same ServiceStack Services.

ServiceStack is a single code-base implementation that supports [.NET's most popular Server platforms](/why-servicestack#multiple-hosting-options) with near perfect source-code [compatibility with .NET Core](/netcore), so all .NET Framework Apps are still relevant in .NET Core. E.g. the [EmailContacts guidance](https://github.com/ServiceStackApps/EmailContacts) walks through the recommended setup and physical layout structure of typical medium-sized ServiceStack projects, including complete documentation of how to create the solution from scratch, whilst explaining all the ServiceStack features it makes use of along the way.

## Starting Project Templates

Once you've familiarized yourself with ServiceStack and are ready to use it in action, get started with a customized starting project template from our online template builder at:

[servicestack.net/start](https://servicestack.net/start)

[![](/img/pages/overview/servicestack-start.png)](https://servicestack.net/start)

# ServiceStack v8.9

Source: https://docs.servicestack.net/releases/v8_09
![](/img/pages/release-notes/v8.9/bg.webp)

We're happy to announce ServiceStack v8.9 - a major release packed with lots of new features and improvements across the board.

Unfortunately these release notes have become apologetically long, so we've added a Table of Contents below to make it easier to jump to the biggest features you're interested in:

- [OrmLite's new Configuration Model and Defaults](#ormlites-new-configuration-model-and-defaults)
- [RDBMS Async Tasks Builder](#rdbms-async-tasks-builder)
- [RDBMS Background Jobs (PostgreSQL, SQL Server, MySql/MariaDB)](#rdbms-background-jobs)
- [RDBMS Request Logging and Analytics (PostgreSQL, SQL Server, MySql/MariaDB)](#rdbms-request-logging-and-analytics)
- [Protect same APIs with API Keys or Identity Auth](#protect-same-apis-with-api-keys-or-identity-auth)
- [AI Chat - OpenAI Chat Compatible API and Server Gateway](#ai-chat)
- [AI Chat UI - Customizable, Private, ChatGPT-like UI](#ai-chat-ui)
- [ChatCompletion custom API Explorer UI](#creating-a-custom-explorer-ui-for-openais-chat-api)
- [XSS Vulnerability fixed in HtmlFormat.html](#xss-vulnerability-fixed-in-htmlformat.html)

## OrmLite's new Configuration Model and Defaults

In continuing with ServiceStack's [seamless integration with the ASP.NET Framework](https://docs.servicestack.net/releases/v8_01), providing a familiar development experience that follows the .NET configuration model and Entity Framework conventions has become a priority.

Implementing a new configuration model also gives us the freedom to change OrmLite's defaults, which wasn't possible before given the paramount importance of maintaining backwards compatibility in a data access library that accesses existing Customer data.

#### JSON used for Complex Types

The biggest change that applies to all RDBMS providers is replacing the JSV serialization used for serializing Complex Types with JSON, now that most RDBMSs have native support for JSON.

#### PostgreSQL uses default Naming Strategy

The biggest change to PostgreSQL is using the same default naming strategy as other RDBMSs, which matches EF's convention that's used for ASP.NET's Identity Auth tables.

#### SQL Server uses latest 2022 Dialect

SQL Server now defaults to the latest SqlServer 2022 dialect, which is also compatible with SQL Server 2016+.

## New Configuration Model

OrmLite's new modern, fluent configuration API aligns with ASP.NET Core's familiar `services.Add*()` pattern. This new approach provides a more intuitive and discoverable way to configure your database connections, with strongly-typed options for each RDBMS provider.

The new configuration model starts with the `AddOrmLite()` extension method to configure its default `IDbConnectionFactory` dependency, combining it with the provider-specific method for the RDBMS you wish to use:

- `UseSqlite()` in **ServiceStack.OrmLite.Sqlite.Data**
- `UsePostgres()` in **ServiceStack.OrmLite.PostgreSQL**
- `UseSqlServer()` in **ServiceStack.OrmLite.SqlServer.Data**
- `UseMySql()` in **ServiceStack.OrmLite.MySql**
- `UseMySqlConnector()` in **ServiceStack.OrmLite.MySqlConnector**
- `UseOracle()` in **ServiceStack.OrmLite.Oracle** (community supported)
- `UseFirebird()` in **ServiceStack.OrmLite.Firebird** (community supported)

Each provider method accepts a connection string and an optional configuration callback that lets you customize the dialect's behavior with IntelliSense support.
It's an alternative to manually instantiating `OrmLiteConnectionFactory` with specific dialect providers, offering better discoverability and a more consistent experience across different database providers.

### SQLite

```csharp
services.AddOrmLite(options => options.UseSqlite(connectionString));
```

Each RDBMS provider can be further customized to change its defaults with:

```csharp
services.AddOrmLite(options => options.UseSqlite(connectionString, dialect => {
        // Default SQLite Configuration:
        dialect.UseJson = true;
        dialect.UseUtc = true;
        dialect.EnableWal = true;
        dialect.EnableForeignKeys = true;
        dialect.BusyTimeout = TimeSpan.FromSeconds(30);
    })
);
```

### PostgreSQL

```csharp
services.AddOrmLite(options => options.UsePostgres(connectionString));
```

With Dialect Configuration:

```csharp
services.AddOrmLite(options => options.UsePostgres(connString, dialect => {
        // Default PostgreSQL Configuration:
        dialect.UseJson = true;
        dialect.NamingStrategy = new OrmLiteNamingStrategyBase();
    })
);
```

### Removed snake_case naming strategy

:::{.float-right .-mt-24! .-mr-12! .max-w-xs .pl-4}
![](/img/pages/release-notes/v8.9/postgres-naming-strategy.webp)
:::

PostgreSQL now defaults to using the same naming strategy as other RDBMSs, i.e. no naming strategy, and uses the PascalCase naming of C# classes as-is. With this change OrmLite's tables and columns now follow EF's convention which is used for ASP.NET's Identity Auth tables.

This is more fragile in PostgreSQL as it forces needing to use quoted table and column names for all queries, e.g.

```sql
SELECT "MyColumn" FROM "MyTable"
```

This is required as PostgreSQL lowercases all unquoted identifiers, e.g:

```sql
SELECT MyColumn FROM MyTable
-- Translates to:
SELECT mycolumn FROM mytable
```

This is already done by OrmLite, but any custom queries also need to use quoted symbols.

### SQL Server

```csharp
services.AddOrmLite(options => options.UseSqlServer(connectionString));
```

With Dialect Configuration:

```csharp
services.AddOrmLite(options => options.UseSqlServer(connString, dialect => {
        // Default SQL Server Configuration:
        dialect.UseJson = true;
    })
);
```

### Uses Latest SQL Server at each .NET LTS Release

To keep it modern and predictable, this will use the latest SQL Server Dialect that was released at the time of each major .NET LTS version, currently `SqlServer2022OrmLiteDialectProvider`, which we'll keep until the next .NET LTS release. The **2022** dialect is also compatible with every SQL Server version from **2016+**.
To use an explicit version of SQL Server you can use the generic overload with the dialect provider that best matches your version, e.g:

```csharp
services.AddOrmLite(options => 
    options.UseSqlServer<SqlServer2016OrmLiteDialectProvider>(connString));
```

### MySQL

```csharp
services.AddOrmLite(options => options.UseMySql(connectionString));
```

With Dialect Configuration:

```csharp
services.AddOrmLite(options => options.UseMySql(connectionString, dialect => {
        // Default MySql Configuration:
        dialect.UseJson = true;
    })
);
```

For MySqlConnector use:

```csharp
services.AddOrmLite(options => options.UseMySqlConnector(connectionString));
```

### Named Connections

The new OrmLite configuration model also streamlines support for named connections, allowing you to register multiple database connections with unique identifiers in a single fluent configuration chain, e.g:

```csharp
services.AddOrmLite(options => {
        options.UseSqlite(":memory:")
            .ConfigureJson(json => {
                json.DefaultSerializer = JsonSerializerType.ServiceStackJson;
            });
    })
    .AddSqlite("db1", "db1.sqlite")
    .AddSqlite("db2", "db2.sqlite")
    .AddPostgres("reporting", PostgreSqlDb.Connection)
    .AddSqlServer("analytics", SqlServerDb.Connection)
    .AddSqlServer("legacy-analytics", SqlServerDb.Connection)
    .AddMySql("wordpress", MySqlDb.Connection)
    .AddMySqlConnector("drupal", MySqlDb.Connection)
    .AddOracle("enterprise", OracleDb.Connection)
    .AddFirebird("firebird", FirebirdDb.Connection);
```

### Complex Type JSON Serialization

Previously OrmLite only supported serializing Complex Types with a [single Complex Type Serializer](https://docs.servicestack.net/ormlite/complex-type-serializers), but the new configuration model now uses a more configurable `JsonComplexTypeSerializer` where you can change the default JSON Serializer OrmLite should use for serializing Complex Types, as well as take fine-grained control over which types should use which serializer, by using the `ConfigureJson()` extension method on each provider:

```csharp
services.AddOrmLite(options => {
    options.UsePostgres(connectionString)
        .ConfigureJson(json => {
            // Default JSON Complex Type Serializer Configuration
            json.DefaultSerializer = JsonSerializerType.ServiceStackJson;
            json.JsonObjectTypes = [
                typeof(object),
                typeof(List<object>),
                typeof(Dictionary<string, object>),
            ];
            json.SystemJsonTypes = [];
            json.ServiceStackJsonTypes = [];
        });
})
```

By default OrmLite uses the **ServiceStack.Text** JSON Serializer which is less strict and more resilient than System.Text.Json for handling versioning of Types that change over time, e.g. an `int` Property that's later changed to a `string`.

In addition to configuring a default you can also configure which types should be serialized with which serializer. So we could change OrmLite to use **System.Text.Json** for all types except for `ChatCompletion` which we want to use **ServiceStack.Text** JSON for:

```csharp
services.AddOrmLite(options => {
    options.UsePostgres(connectionString)
        .ConfigureJson(json => {
            json.DefaultSerializer = JsonSerializerType.SystemJson;
            json.ServiceStackJsonTypes = [ typeof(ChatCompletion) ];
        });
})
```

#### Unstructured JSON with JSON Object

The default exception to this is the serialization of `object`, `List<object>` and `Dictionary<string, object>` types which is better handled by [#Script's JSON Parser](https://docs.servicestack.net/js-utils), which is able to parse any valid adhoc JSON into untyped .NET generic collections that are both mutable and able to [utilize C# pattern matching](https://docs.servicestack.net/js-utils#getting-the-client_id-in-a-comfyui-output) for easy introspection.
The new `TryGetValue` extension method on `Dictionary<string, object>` makes it even more convenient for parsing adhoc JSON, using the `out` Type parameter to reduce unnecessary type checking. E.g. here's a simple example of parsing a ComfyUI Output for the client_id used in a generation:

```csharp
var comfyOutput = JSON.ParseObject(json);
var prompt = (Dictionary<string, object>)comfyOutput.Values.First()!;
if (prompt.TryGetValue("prompt", out List<object> tuple) && tuple.Count > 3)
{
    if (tuple[3] is Dictionary<string, object> extraData &&
        extraData.TryGetValue("client_id", out string clientId))
    {
        Console.WriteLine(clientId);
    }
}
```

Whereas an equivalent implementation using System.Text.Json's `JsonDocument` would look like:

```csharp
using System.Text.Json;

var jsonDocument = JsonDocument.Parse(json);
var root = jsonDocument.RootElement;

// Get the first property value (equivalent to comfyOutput.Values.First())
var firstProperty = root.EnumerateObject().FirstOrDefault();
if (firstProperty.Value.ValueKind == JsonValueKind.Object)
{
    var prompt = firstProperty.Value;
    if (prompt.TryGetProperty("prompt", out var promptElement) &&
        promptElement.ValueKind == JsonValueKind.Array)
    {
        var promptArray = promptElement.EnumerateArray().ToArray();
        if (promptArray.Length > 3)
        {
            var extraDataElement = promptArray[3];
            if (extraDataElement.ValueKind == JsonValueKind.Object &&
                extraDataElement.TryGetProperty("client_id", out var clientIdElement) &&
                clientIdElement.ValueKind == JsonValueKind.String)
            {
                var clientId = clientIdElement.GetString();
                Console.WriteLine(clientId);
            }
        }
    }
}
```

### Table Aliases

One potential breaking change is that table aliases are now used verbatim and no longer use a naming strategy to transform their name, which potentially affects PostgreSQL when an Alias is used that doesn't match the name of the table, e.g:

```csharp
[Alias("MyTable")] //= "MyTable"
public class NewMyTable { ... }

[Alias("MyTable")] //= my_table
public class OldMyTable { ... }
```

Aliases should either be changed to the Table name you want to use, or you can use the Naming Strategy Alias dictionaries for finer-grained control over which Schema, Table and Column Names and Aliases should be used:

```csharp
services.AddOrmLite(options => options.UsePostgres(connString, dialect => {
    dialect.NamingStrategy.TableAliases["MyTable"] = "my_table";
    dialect.NamingStrategy.SchemaAliases["MySchema"] = "my_schema";
    dialect.NamingStrategy.ColumnAliases["MyColumn"] = "my_column";
}));
```

### Table Refs

A significant internal refactor of OrmLite was done to encapsulate the different ways of referring to a table in a single `TableRef` struct, which is now used in all APIs that need a table reference.

The new `TableRef` struct allows for unified APIs that encapsulate the different ways of referencing a table:

- Type: `new TableRef(typeof(MyTable))`
- Model Definition: `new TableRef(ModelDefinition<MyTable>.Definition)`
- Table Name: `new TableRef("MyTable")`
- Schema and Table Name: `new TableRef("MySchema", "MyTable")`
- Quoted Name (used verbatim): `TableRef.Literal("\"MyTable\"")`
- Implicit cast from a string, i.e. `"MyTable"` becomes `new TableRef("MyTable")`

OrmLite handles differences between different RDBMS Providers via its `IOrmLiteDialectProvider` interface.
Previously OrmLite maintained multiple overloads for handling some of these differences in referencing a table, but they've now all been consolidated into a single `TableRef` parameter:

```csharp
public interface IOrmLiteDialectProvider
{
    string GetTableNameOnly(TableRef tableRef);
    string UnquotedTable(TableRef tableRef);
    string GetSchemaName(TableRef tableRef);
    string QuoteTable(TableRef tableRef);
    bool DoesTableExist(IDbConnection db, TableRef tableRef);
    bool DoesColumnExist(IDbConnection db, string columnName, TableRef tableRef);
    string ToAddColumnStatement(TableRef tableRef, FieldDefinition fieldDef);
    string ToAlterColumnStatement(TableRef tableRef, FieldDefinition fieldDef);
    string ToChangeColumnNameStatement(TableRef tableRef, FieldDefinition fieldDef, string oldColumn);
    string ToRenameColumnStatement(TableRef tableRef, string oldColumn, string newColumn);
    string ToDropColumnStatement(TableRef tableRef, string column);
    string ToDropConstraintStatement(TableRef tableRef, string constraint);
    string ToDropForeignKeyStatement(TableRef tableRef, string foreignKeyName);
}
```

For example the `QuoteTable(TableRef)` method can be used to quote a table. Assuming our dialect was configured with the `my_table` Table Alias, these are the results for the different ways of referencing `MyTable`:

```csharp
dialect.QuoteTable("MyTable")                        //= "my_table" (implicit)
dialect.QuoteTable(new("MyTable"))                   //= "my_table"
dialect.QuoteTable(new("MySchema","MyTable"))        //= "my_schema"."my_table"
dialect.QuoteTable(TableRef.Literal("\"MyTable\""))  //= "MyTable" (verbatim)
dialect.QuoteTable(new(typeof(MyTable)))             //= "my_table"
dialect.QuoteTable(new(ModelDefinition<MyTable>.Definition)) //= "my_table"
```

### Improved Observability

Significant effort was put into improving OrmLite's Observability, where OrmLite's DB Connections can now be tagged to make them easier to track in hooks, logs and traces.

To achieve this, a new `Action<IDbConnection>` `configure` callback was added to OrmLite's Open Connection APIs which is invoked before a DB Connection is opened, e.g:

```csharp
using var db = dbFactory.Open(configure: db => db.WithTag("MyTag"));

using var db = dbFactory.Open(namedConnection, configure: db => db.WithTag("MyTag"));

using var db = HostContext.AppHost.GetDbConnection(req, configure: db => db.WithTag("MyTag"));
```

ServiceStack uses this internally to tag DB Connections with the feature executing it, or for `Db` connections used in Services it will tag it with the Request DTO Name.
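As a rough sketch (assuming OrmLite's existing `OrmLiteConfig.BeforeExecFilter` hook and that the connection tag is available on the command, as it is in the `AfterExecFilter` example below), a tag can also be surfaced in your own hooks before a command executes:

```csharp
OrmLiteConfig.BeforeExecFilter = cmd => {
    // Include the connection tag (if any) when logging the SQL about to be executed
    var tag = cmd.GetTag();
    Console.WriteLine(tag != null
        ? $"[{tag}] {cmd.CommandText}"
        : cmd.CommandText);
};
```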
:::{.wideshot}
![](/img/pages/release-notes/v8.9/ormlite-tags.webp)
:::

If a tag is configured, it's also included in OrmLite's Debug Logging output, e.g:

```txt
dbug: ServiceStack.OrmLiteLog[0]
      [PostgresDbJobsProvider] SQL: SELECT "Id", "ParentId", "RefId", "Worker", "Tag", "BatchId", "Callback",
      "DependsOn", "RunAfter", "CreatedDate", "CreatedBy", "RequestId", "RequestType", "Command", "Request",
      "RequestBody", "UserId", "Response", "ResponseBody", "State", "StartedDate", "CompletedDate", "NotifiedDate",
      "RetryLimit", "Attempts", "DurationMs", "TimeoutSecs", "Progress", "Status", "Logs", "LastActivityDate",
      "ReplyTo", "ErrorCode", "Error", "Args", "Meta"
      FROM "BackgroundJob" WHERE ("State" = :0)
      PARAMS: :0=Cancelled
dbug: ServiceStack.OrmLiteLog[0]
      TIME: 1.818m
```

#### DB Command Execution Timing

OrmLite's debug logging now also includes the elapsed time it took to execute the command, which is also available via the `IDbCommand` `GetTag()` and `GetElapsedTime()` APIs, e.g:

```csharp
OrmLiteConfig.AfterExecFilter = cmd => {
    Console.WriteLine($"[{cmd.GetTag()}] {cmd.GetElapsedTime()}");
};
```

### ExistsById APIs

New `ExistsById` APIs for checking if a row exists for a given Id:

```csharp
db.ExistsById<MyTable>(1);
await db.ExistsByIdAsync<MyTable>(1);

// Alternative to:
db.Exists<MyTable>(x => x.Id == 1);
await db.ExistsAsync<MyTable>(x => x.Id == 1);
```

### ResetSequence for PostgreSQL

The `ResetSequence` API is available to reset a Table's Id sequence in Postgres:

```csharp
db.ResetSequence<MyTable>(x => x.Id);
```

#### Data Import example using BulkInsert

This is useful to reset a PostgreSQL Table's auto-incrementing sequence when re-importing a dataset from a different database, e.g:

```csharp
db.DeleteAll<Rockstar>();
db.ResetSequence<Rockstar>(x => x.Id);
db.DeleteAll<Album>();
db.ResetSequence<Album>(x => x.Id);

var config = new BulkInsertConfig { Mode = BulkInsertMode.Sql };
db.BulkInsert(dbSqlite.Select<Rockstar>().OrderBy(x => x.Id), config);
db.BulkInsert(dbSqlite.Select<Album>().OrderBy(x => x.Id), config);
```

### New SqlDateFormat and SqlChar Dialect APIs

The SQL Dialect functions provide an RDBMS-agnostic way to call SQL functions that differ among RDBMSs. `DateFormat` accepts [SQLite strftime() function](https://www.w3resource.com/sqlite/sqlite-strftime.php) date and time modifiers in its format string whilst `Char` accepts a character code, e.g:

```csharp
var q = db.From<CompletedJob>();
var createdDate = q.Column<CompletedJob>(c => c.CreatedDate);
var months = db.SqlColumn<string>(q
    .Select(x => new {
        Month = q.sql.DateFormat(createdDate, "%Y-%m"),
        Log = q.sql.Concat(new[]{ "'Prefix'", q.sql.Char(10), createdDate })
    }));
```

When executed in PostgreSQL it would generate:

```sql
SELECT TO_CHAR("CreatedDate", 'YYYY-MM'), 'Prefix' || CHR(10) || "CreatedDate"
FROM "CompletedJob"
```

## RDBMS Async Tasks Builder

### Sequential Async DB Access

Async improves I/O thread utilization in multi-threaded apps like Web Servers. However, it doesn't improve the performance of individual API Requests that need to execute multiple independent DB Requests, which are often written to run async db access sequentially like this:

```csharp
var rockstars = await Db.SelectAsync<Rockstar>();
var albums = await Db.SelectAsync<Album>();
var departments = await Db.SelectAsync<Department>();
var employees = await Db.SelectAsync<Employee>();
```

The issue is that they're not run in parallel: each DB Request is executed sequentially, with the Request for Albums not starting until the Request for Rockstars has completed.
To run them in parallel you would need to open multiple scoped DB Connections, await them concurrently, then do the syntactic boilerplate gymnastics required to extract the generic typed results, e.g:

```csharp
var connections = await Task.WhenAll(
    DbFactory.OpenDbConnectionAsync(),
    DbFactory.OpenDbConnectionAsync(),
    DbFactory.OpenDbConnectionAsync(),
    DbFactory.OpenDbConnectionAsync()
);
using var dbRockstars = connections[0];
using var dbAlbums = connections[1];
using var dbDepartments = connections[2];
using var dbEmployees = connections[3];

var tasks = new List<Task> {
    dbRockstars.SelectAsync<Rockstar>(),
    dbAlbums.SelectAsync<Album>(),
    dbDepartments.SelectAsync<Department>(),
    dbEmployees.SelectAsync<Employee>()
};
await Task.WhenAll(tasks);

var rockstars = ((Task<List<Rockstar>>)tasks[0]).Result;
var albums = ((Task<List<Album>>)tasks[1]).Result;
var departments = ((Task<List<Department>>)tasks[2]).Result;
var employees = ((Task<List<Employee>>)tasks[3]).Result;
```

Even without error handling, writing code like this quickly becomes tedious, less readable and error prone, and as a result it's rarely done.

### Parallel DB Requests in TypeScript

This is easier to achieve in languages like TypeScript where typed ORMs like [litdb.dev](https://litdb.dev) can run multiple DB Requests in parallel with just:

```ts
const [rockstars, albums, departments, employees] = await Promise.all([
    db.all($.from(Rockstar)),     //= Rockstar[]
    db.all($.from(Album)),        //= Album[]
    db.all($.from(Department)),   //= Department[]
    db.all($.from(Employee)),     //= Employee[]
])
```

This benefits from TypeScript's powerful type system that allows destructuring arrays whilst preserving their positional types, whilst its single-threaded event loop lets you reuse the same DB Connection to run DB Requests in parallel without multi-threading issues.

## OrmLite's new Async Tasks Builder

OrmLite's new `AsyncDbTasksBuilder` provides a similar benefit of making it effortless to run multiple async DB Requests in parallel, which looks like:

```csharp
var results = await DbFactory.AsyncDbTasksBuilder()
    .Add(db => db.SelectAsync<Rockstar>())
    .Add(db => db.SelectAsync<Album>())
    .Add(db => db.SelectAsync<Department>())
    .Add(db => db.SelectAsync<Employee>())
    .RunAsync();

var (rockstars, albums, departments, employees) = results;
```

Just like TypeScript's destructuring, it returns a positionally typed tuple of the results which can be destructured back into their typed variables, e.g:

```csharp
(List<Rockstar> rockstars, List<Album> albums, List<Department> departments, List<Employee> employees) = results;
```

### Supports up to 8 Tasks

It allows chaining up to **8 async Tasks in parallel** as C#'s Type System doesn't allow for preserving different positional generic types in an unbounded collection. Instead, each added Task returns a new generic builder which preserves the positional types before it.
### Supports both Async `Task` and `Task<T>` APIs

Where `Task` and `Task<T>` APIs can be mixed and matched interchangeably:

```csharp
var builder = DbFactory.AsyncDbTasksBuilder()
    .Add(db => db.InsertAsync(rockstars[0], rockstars[1]))
    .Add(db => db.SelectAsync<Rockstar>())
    .Add(db => db.InsertAsync(albums[2], albums[3]))
    .Add(db => db.SelectAsync<Album>())
    .Add(db => db.InsertAsync([department]))
    .Add(db => db.SelectAsync<Department>())
    .Add(db => db.InsertAsync([employee]))
    .Add(db => db.SelectAsync<Employee>());
```

Where to preserve the results chain, `Task` APIs return `bool` results, e.g:

```csharp
(bool r1, List<Rockstar> r2, bool r3, List<Album> r4, bool r5, List<Department> r6, bool r7, List<Employee> r8) 
    = await builder.RunAsync();
```

### Error Handling

Whilst tasks are executed in parallel when added, any Exceptions are only thrown when the builder is awaited:

```csharp
using var Db = await OpenDbConnectionAsync();

var builder = DbFactory.AsyncDbTasksBuilder()
    .Add(db => db.InsertAsync(rockstars[0]))
    .Add(db => db.InsertAsync(rockstars[0])); // <-- Duplicate PK Exception

// Exceptions are not thrown until the builder is awaited
try
{
    var results = await builder.RunAsync();
}
catch (Exception e)
{
    // Handle Duplicate PK Exception
}
```

## RDBMS Background Jobs

We're excited to announce that we've ported our much loved [Background Jobs](https://docs.servicestack.net/background-jobs) feature for SQLite to the popular **PostgreSQL**, **SQL Server** and **MySQL** RDBMS's.

Whilst we love [SQLite + Litestream](https://docs.servicestack.net/ormlite/litestream) for its low DevOps maintenance allowing us to break free from [expensive cloud hosting costs](https://docs.servicestack.net/ormlite/litestream#the-right-time-for-server-side-sqlite) for managed RDBMS's, it's clear many of our Customers need the features of an industrial strength RDBMS.

In future we'll also be looking at providing a great self-hosted managed solution for Customers that can be run free of expensive cloud hosting costs (starting with PostgreSQL). Before we can focus on this we needed to rewrite all our SQLite-only features to work with OrmLite's other premier supported RDBMS's.

The new **DatabaseJobFeature** is a purpose-built implementation for PostgreSQL, SQL Server and MySQL backends that's a drop-in replacement for SQLite's **BackgroundsJobFeature** which can be applied to an existing .NET 8+ project by [mixing in](https://docs.servicestack.net/mix-tool) the **db-identity** or **db-jobs** gist files to your host project.

### Install

For [ServiceStack ASP.NET Identity Auth](https://servicestack.net/start) Projects:

:::sh
x mix db-identity
:::

Which replaces `Configure.BackgroundJobs.cs` and `Configure.RequestLogs.cs` with equivalent versions that use the new `DatabaseJobFeature` for sending Application Emails and `DbRequestLogger` for API Request Logging.
All other .NET 8+ ServiceStack Apps should instead use:

:::sh
x mix db-jobs
:::

Which replaces `Configure.BackgroundJobs.cs` to use `DatabaseJobFeature`:

```csharp
public class ConfigureBackgroundJobs : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            services.AddPlugin(new CommandsFeature());
            services.AddPlugin(new DatabaseJobFeature {
                // NamedConnection = ""
            });
            services.AddHostedService<JobsHostedService>();
        }).ConfigureAppHost(afterAppHostInit: appHost => {
            var services = appHost.GetApplicationServices();
            var jobs = services.GetRequiredService<IBackgroundJobs>();
            // Example of registering a Recurring Job to run Every Hour
            //jobs.RecurringCommand(Schedule.Hourly);
        });
}

public class JobsHostedService(ILogger<JobsHostedService> log, IBackgroundJobs jobs) : BackgroundService
{
    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        await jobs.StartAsync(stoppingToken);
        using var timer = new PeriodicTimer(TimeSpan.FromSeconds(3));
        while (!stoppingToken.IsCancellationRequested &&
               await timer.WaitForNextTickAsync(stoppingToken))
        {
            await jobs.TickAsync();
        }
    }
}
```

Fortunately we were able to reuse the same `IBackgroundJobs` interface, Data Models, and API Service Contracts which greatly simplifies any migration efforts from SQLite's **ServiceStack.Jobs** implementation.

By implementing the same API Service Contracts (i.e. Request/Response DTOs) we're also able to reuse the same [built-in](/auto-ui) Management UI to provide real-time monitoring, inspection and management of background jobs:

:::youtube 2Cza_a_rrjA
Durable C# Background Jobs and Scheduled Tasks for .NET
:::

## RDBMS Optimizations

A key benefit of using SQLite for Background Jobs was the ability to easily maintain completed and failed job history in separate **monthly databases**. This approach prevented the main application database from growing unbounded by archiving historical job data into isolated monthly SQLite database files (e.g., `jobs_2025-01.db`, `jobs_2025-02.db`).

These monthly databases could be easily backed up, archived to cold storage, or deleted after a retention period, providing a simple yet effective data lifecycle management strategy.

For the new **DatabaseJobFeature** supporting PostgreSQL, SQL Server, and MySQL, we've replicated this monthly partitioning strategy using **monthly partitioned SQL tables** for the `CompletedJob` and `FailedJob` archive tables.

### PostgreSQL - Native Table Partitioning

PostgreSQL provides native support for table partitioning, allowing us to automatically create monthly partitions using `PARTITION BY RANGE` on the `CreatedDate` column.

The `DatabaseJobFeature` automatically creates new monthly partitions as needed, maintaining the same logical separation as SQLite's monthly .db's while keeping everything within a single Postgres DB:

```sql
CREATE TABLE CompletedJob (
    -- columns...
    CreatedDate TIMESTAMP NOT NULL,
    PRIMARY KEY ("Id","CreatedDate")
) PARTITION BY RANGE ("CreatedDate");

-- Monthly partitions are automatically created, e.g.:
CREATE TABLE CompletedJob_2025_01 PARTITION OF CompletedJob
    FOR VALUES FROM ('2025-01-01') TO ('2025-02-01');
```

This provides excellent query performance since PostgreSQL can use partition pruning to only scan relevant monthly partitions when filtering by `CreatedDate`.
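For example, queries constrained by `CreatedDate` (like the monthly views in the Jobs Admin UI) only touch the matching partition. A sketch of the kind of query that benefits, assuming you query the `CompletedJob` table directly with OrmLite:

```csharp
// Only the CompletedJob_2025_01 partition needs to be scanned for this range
var monthStart = new DateTime(2025, 1, 1);
var januaryJobs = await db.SelectAsync(db.From<CompletedJob>()
    .Where(x => x.CreatedDate >= monthStart && x.CreatedDate < monthStart.AddMonths(1))
    .OrderByDescending(x => x.CreatedDate));
```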
### SQLServer / MySQL - Manual Partition Management For **SQL Server** and **MySQL**, monthly partitioned tables need to be created **out-of-band** (either manually or via cronjob scripts) since they don't support the same level of automatic partition management as PostgreSQL. However, this still works well in practice as it uses: 1. **Write-Only Tables** - The `CompletedJob` and `FailedJob` tables are write-only append tables. Jobs are never updated after completion or failure, only inserted. 2. **CreatedDate Index** - All queries against these tables use the `CreatedDate` indexed column for filtering and sorting, ensuring efficient access patterns even as the tables grow. The indexed `CreatedDate` column ensures that queries remain performant regardless of table size, and the write-only nature means there's no complex update logic to manage across partitions. This approach maintains the same benefits as SQLite's monthly databases - easy archival, manageable table sizes, and efficient queries - while leveraging the scalability and features of enterprise RDBMS systems. ### Separate Jobs Database Or if preferred, you can maintain background jobs in a **separate database** from your main application database. This separation keeps the write-heavy job processing load off your primary database, allowing you to optimize each database independently for its specific workload patterns like maintaining different backup strategies for your critical application data vs. job history. ```csharp // Configure.Db.cs services.AddOrmLite(options => options.UsePostgres(connectionString)) .AddPostgres("jobs", jobsConnectionString); // Configure.BackgroundJobs.cs services.AddPlugin(new DatabaseJobFeature { NamedConnection = "jobs" }); ``` ### Real Time Admin UI The Jobs Admin UI provides a real time view into the status of all background jobs including their progress, completion times, Executed, Failed, and Cancelled Jobs, etc. which is useful for monitoring and debugging purposes. [![](/img/pages/jobs/jobs-dashboard.webp)](/img/pages/jobs/jobs-dashboard.webp) View Real-time progress of queued Jobs [![](/img/pages/jobs/jobs-queue.webp)](/img/pages/jobs/jobs-queue.webp) View real-time progress logs of executing Jobs [![](/img/pages/jobs/jobs-logs.webp)](/img/pages/jobs/jobs-logs.webp) View Job Summary and Monthly Databases of Completed and Failed Jobs [![](/img/pages/jobs/jobs-completed.webp)](/img/pages/jobs/jobs-completed.webp) View full state and execution history of each Job [![](/img/pages/jobs/jobs-failed.webp)](/img/pages/jobs/jobs-failed.webp) Cancel Running jobs and Requeue failed jobs ## Usage For even greater reuse of your APIs you're able to queue your existing ServiceStack Request DTOs as a Background Job in addition to [Commands](https://docs.servicestack.net/commands) for encapsulating units of logic into internal invokable, inspectable and auto-retryable building blocks. ### Queue Commands Any API, Controller or Minimal API can execute jobs with the `IBackgroundJobs` dependency, e.g. here's how you can run a background job to send a new email when an API is called in any new Identity Auth template: ```csharp class MyService(IBackgroundJobs jobs) : Service { public object Any(MyOrder request) { var jobRef = jobs.EnqueueCommand(new SendEmail { To = "my@email.com", Subject = $"Received New Order {request.Id}", BodyText = $""" Order Details: {request.OrderDetails.DumptTable()} """, }); //... 
    }
}
```

Which records and immediately executes a worker to execute the `SendEmailCommand` with the specified `SendEmail` Request argument. It also returns a reference to a Job which can be used later to query and track the execution of a job.

### Queue APIs

Alternatively a `SendEmail` API could be executed with just the Request DTO:

```csharp
var jobRef = jobs.EnqueueApi(new SendEmail {
    To = "my@email.com",
    Subject = $"Received New Order {request.Id}",
    BodyText = $"""
        Order Details:
        {request.OrderDetails.DumpTable()}
        """,
});
```

Although Sending Emails is typically not an API you'd want to make externally available, in which case you'd want to [Restrict access](https://docs.servicestack.net/auth/restricting-services) or [limit usage to specified users](https://docs.servicestack.net/auth/identity-auth#declarative-validation-attributes).

In both cases the `SendEmail` Request is persisted into the Jobs database for durability and gets updated as it progresses through the queue.

For execution the API or command is resolved from the IOC before being invoked with the Request.

APIs are executed via the [MQ Request Pipeline](https://docs.servicestack.net/order-of-operations) and commands executed using the [Commands Feature](https://docs.servicestack.net/commands) where they'll also be visible in the [Commands Admin UI](https://docs.servicestack.net/commands#command-admin-ui).

### Background Job Options

The behavior for each `Enqueue*` method for executing background jobs can be customized with the following options:

- `Worker` - Serially process job using a named worker thread
- `Callback` - Invoke another command with the result of a successful job
- `DependsOn` - Execute jobs after successful completion of a dependent job
  - If parent job fails all dependent jobs are cancelled
- `UserId` - Execute within an Authenticated User Context
- `RunAfter` - Queue jobs that are only run after a specified date
- `RetryLimit` - Override default retry limit for how many attempts should be made to execute a job
- `TimeoutSecs` - Override default timeout for how long a job should run before being cancelled
- `RefId` - Allow clients to specify a unique Id (e.g Guid) to track job
- `Tag` - Group related jobs under a user specified tag
- `CreatedBy` - Optional field for capturing the owner of a job
- `BatchId` - Group multiple jobs with the same Id
- `ReplyTo` - Optional field for capturing where to send notification for completion of a Job
- `Args` - Optional String Dictionary of Arguments that can be attached to a Job

### Feature Overview

It packs most features needed in a Background Jobs solution including:

- Use your App's existing RDBMS (no other infrastructure dependencies)
- Execute existing APIs or versatile Commands
  - Commands auto registered in IOC
- Scheduled Recurring Tasks
  - Track Last Job Run
- Serially execute jobs with the same named Worker
- Queue Jobs dependent on successful completion of parent Job
- Queue Jobs to be executed after a specified Date
- Execute Jobs within the context of an Authenticated User
- Auto retry failed jobs on a default or per-job limit
- Timeout Jobs on a default or per-job limit
- Cancellable Jobs
- Requeue Failed Jobs
- Execute custom callbacks on successful execution of Job
- Maintain Status, Logs, and Progress of Executing Jobs
- Execute transient (i.e.
non-durable) jobs using named workers - Attach optional `Tag`, `BatchId`, `CreatedBy`, `ReplyTo` and `Args` with Jobs ## Schedule Recurring Tasks In addition to queueing jobs to run in the background, it also supports scheduling recurring tasks to execute APIs or Commands at fixed intervals. :::youtube DtB8KaXXMCM Schedule your Reoccurring Tasks with Background Jobs! ::: APIs and Commands can be scheduled to run at either a `TimeSpan` or [CRON Expression](https://github.com/HangfireIO/Cronos?tab=readme-ov-file#cron-format) interval, e.g: ## CRON Expression Examples ```csharp // Every Minute Expression jobs.RecurringCommand(Schedule.Cron("* * * * *")); // Every Minute Constant jobs.RecurringCommand(Schedule.EveryMinute, new CheckUrls { Urls = urls }); ``` ### CRON Format You can use any **unix-cron format** expression supported by the [HangfireIO/Cronos](https://github.com/HangfireIO/Cronos) library: ```txt |------------------------------- Minute (0-59) | |------------------------- Hour (0-23) | | |------------------- Day of the month (1-31) | | | |------------- Month (1-12; or JAN to DEC) | | | | |------- Day of the week (0-6; or SUN to SAT) | | | | | | | | | | * * * * * ``` The allowed formats for each field include: | Field | Format of valid values | |------------------|--------------------------------------------| | Minute | 0-59 | | Hour | 0-23 | | Day of the month | 1-31 | | Month | 1-12 (or JAN to DEC) | | Day of the week | 0-6 (or SUN to SAT; or 7 for Sunday) | #### Matching all values To match all values for a field, use the asterisk: `*`, e.g here are two examples in which the minute field is left unrestricted: - `* 0 1 1 1` - the job runs every minute of the midnight hour on January 1st and Mondays. - `* * * * *` - the job runs every minute (of every hour, of every day of the month, of every month, every day of the week, because each of these fields is unrestricted too). #### Matching a range To match a range of values, specify your start and stop values, separated by a hyphen (-). Do not include spaces in the range. Ranges are inclusive. The first value must be less than the second. The following equivalent examples run at midnight on Mondays, Tuesdays, Wednesdays, Thursdays, and Fridays (for all months): - `0 0 * * 1-5` - `0 0 * * MON-FRI` #### Matching a list Lists can contain any valid value for the field, including ranges. Specify your values, separated by a comma (,). Do not include spaces in the list, e.g: - `0 0,12 * * *` - the job runs at midnight and noon. - `0-5,30-35 * * * *` - the job runs in each of the first five minutes of every half hour (at the top of the hour and at half past the hour). 
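For instance, the weekday-midnight expression above could be used to schedule a nightly report. A minimal sketch, assuming a hypothetical `GenerateReportCommand` and the `jobs` reference being the injected `IBackgroundJobs` dependency from the earlier examples:

```csharp
// Runs at midnight Monday through Friday
jobs.RecurringCommand<GenerateReportCommand>(Schedule.Cron("0 0 * * MON-FRI"));
```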
### TimeSpan Interval Examples

```csharp
jobs.RecurringCommand(
    Schedule.Interval(TimeSpan.FromMinutes(1)));

// With Example
jobs.RecurringApi(Schedule.Interval(TimeSpan.FromMinutes(1)),
    new CheckUrls { Urls = urls });
```

These can be registered with an optional **Task Name** and **Background Options**, e.g:

```csharp
jobs.RecurringCommand("Check URLs", Schedule.EveryMinute, new() {
    RunCommand = true // don't persist job
});
```

:::info
If no name is provided, the Command's Name or API's Request DTO will be used
:::

### Idempotent Registration

Scheduled Tasks are idempotent where the same registration with the same name will either create or update the scheduled task registration without losing track of the last time the Recurring Task was run, as such it's recommended to always define your App's Scheduled Tasks on Startup:

```csharp
public class ConfigureBackgroundJobs : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context,services) => {
            //...
        }).ConfigureAppHost(afterAppHostInit: appHost => {
            var services = appHost.GetApplicationServices();
            var jobs = services.GetRequiredService<IBackgroundJobs>();

            // App's Scheduled Tasks Registrations:
            jobs.RecurringCommand(Schedule.Hourly);
        });
}
```

## Interned Cronos

A major source of friction in .NET Libraries and most Frameworks from all platforms in general is dependency conflicts.

E.g. Conflicting versions of JSON.NET have plagued many a .NET library and framework for several years, something that never impacted ServiceStack Apps since we maintain our own fast/flexible JSON Serializer and have never had a dependency on JSON.NET.

As supply chain attacks from external OSS libraries have become more common, it's even more important to avoid taking dependencies on external libraries where possible.

As we now have multiple packages that referenced [Hangfire's Cronos](https://github.com/HangfireIO/Cronos) library we've decided to intern it in ServiceStack, removing the previous dependency **ServiceStack.Jobs** had on Cronos.

The only issue was that [CronParser.cs](https://github.com/HangfireIO/Cronos/blob/main/src/Cronos/CronParser.cs) uses unsafe parsing and we don't allow unsafe code in any ServiceStack package, so it was rewritten to use Spans in our interned [CronParser.cs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.Common/Cronos/CronParser.cs) implementation.

It's released under the same MIT License as Cronos so anyone else is welcome to use it, as is our port of their [CronExpressionTests.cs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/tests/ServiceStack.Common.Tests/CronExpressionTests.cs) to NUnit.

## Background Jobs Admin UI

The last job the Recurring Task ran is also viewable in the Jobs Admin UI:

[![](/img/pages/jobs/jobs-scheduled-tasks-last-job.webp)](/img/pages/jobs/jobs-scheduled-tasks-last-job.webp)

### Executing non-durable jobs

`IBackgroundJobs` also supports `RunCommand*` methods for executing background jobs transiently (i.e. non-durable), which is useful for commands that want to be serially executed by a named worker but don't need to be persisted.

#### Execute in Background and return immediately

You could use this to queue system emails to be sent by the same **smtp** worker when you're happy not to have their state and execution history tracked in the Jobs database.

```csharp
var job = jobs.RunCommand(new SendEmail { ...
}, new() { Worker = "smtp" }); ``` In this case `RunCommand` returns the actual `BackgroundJob` instance that will be updated by the worker. #### Execute in Background and wait for completion You can also use `RunCommandAsync` if you prefer to wait until the job has been executed. Instead of a Job it returns the **Result** of the command if it returned one. ```csharp var result = await jobs.RunCommandAsync(new SendEmail {...}, new() { Worker = "smtp" }); ``` ### Serially Execute Jobs with named Workers By default jobs are executed immediately in a new Task, we can also change the behavior to instead execute jobs one-by-one in a serial queue by specifying them to use the same named worker as seen in the example above. Alternatively you can annotate the command with the `[Worker]` attribute if you **always** want all jobs executing the command to use the same worker: ```csharp [Worker("smtp")] public class SendEmailCommand(IBackgroundJobs jobs) : SyncCommand { //... } ``` ### Use Callbacks to process the results of Commands Callbacks can be used to extend the lifetime of a job to include processing a callback to process its results. This is useful where you would like to reuse the the same command but handle the results differently, e.g. the same command can email results or invoke a webhook by using a callback: ```csharp jobs.EnqueueCommand(new CheckUrls { Urls = allUrls }, new() { Callback = nameof(EmailUrlResultsCommand), }); jobs.EnqueueCommand(new CheckUrls { Urls = criticalUrls }, new() { Callback = nameof(WebhookUrlResultsCommand), ReplyTo = callbackUrl }); ``` Callbacks that fail are auto-retried the same number of times as their jobs, which if they all fail then the entire job is also marked as failed. ### Run Job dependent on successful completion of parent Jobs can be queued to only run after the successful completion of another job, this is useful for when you need to kick off multiple jobs after a long running task has finished like generating monthly reports after monthly data has been aggregated, e.g: ```csharp var jobRef = jobs.EnqueueCommand(new Aggregate { Month = DateTime.UtcNow }); jobs.EnqueueCommand(new () { DependsOn = jobRef.Id, }); jobs.EnqueueCommand(new () { DependsOn = jobRef.Id, }); ``` Inside your command you can get a reference to your current job with `Request.GetBackgroundJob()` which will have its `ParentId` populated with the parent job Id and `job.ParentJob` containing a reference to the completed Parent Job where you can access its Request, Results, and other job information: ```csharp public class GenerateSalesReportCommand(ILogger log) : SyncCommand { protected override void Run() { var job = Request.GetBackgroundJob(); var parentJob = job.ParentJob; } } ``` ### Atomic Batching Behavior We can also use `DependsOn` to implement atomic batching behavior where from inside our executing command we can queue new jobs that are dependent on the successful execution of the current job, e.g: ```csharp public class CheckUrlsCommand(IHttpClientFactory factory, IBackgroundJobs jobs) : AsyncCommand { protected override async Task RunAsync(CheckUrls req, CancellationToken ct) { var job = Request.GetBackgroundJob(); var batchId = Guid.NewGuid().ToString("N"); using var client = factory.CreateClient(); foreach (var url in req.Urls) { var msg = new HttpRequestMessage(HttpMethod.Get, url); var response = await client.SendAsync(msg, ct); response.EnsureSuccessStatusCode(); jobs.EnqueueCommand(new SendEmail { To = "my@email.com", Subject = $"{new Uri(url).Host} status", 
BodyText = $"{url} is up", }, new() { DependsOn = job.Id, BatchId = batchId, }); } } } ``` Where any dependent jobs are only executed if the job was successfully completed. If instead an exception was thrown during execution, the job will be failed and all its dependent jobs cancelled and removed from the queue. ### Executing jobs with an Authenticated User Context If you have existing logic dependent on a Authenticated `ClaimsPrincipal` or ServiceStack `IAuthSession` you can have your APIs and Commands also be executed with that user context by specifying the `UserId` the job should be executed as: ```csharp var openAiRequest = new CreateOpenAiChat { Request = new() { Model = "gpt-4", Messages = [ new() { Content = request.Question } ] }, }; // Example executing API Job with User Context jobs.EnqueueApi(openAiRequest, new() { UserId = Request.GetClaimsPrincipal().GetUserId(), CreatedBy = Request.GetClaimsPrincipal().GetUserName(), }); // Example executing Command Job with User Context jobs.EnqueueCommand(openAiRequest, new() { UserId = Request.GetClaimsPrincipal().GetUserId(), CreatedBy = Request.GetClaimsPrincipal().GetUserName(), }); ``` Inside your API or Command you access the populated User `ClaimsPrincipal` or ServiceStack `IAuthSession` using the same APIs that you'd use inside your ServiceStack APIs, e.g: ```csharp public class CreateOpenAiChatCommand(IBackgroundJobs jobs) : AsyncCommand { protected override async Task RunAsync( CreateOpenAiChat request, CancellationToken token) { var user = Request.GetClaimsPrincipal(); var session = Request.GetSession(); //... } } ``` ### Queue Job to run after a specified date Using `RunAfter` lets you queue jobs that are only executed after a specified `DateTime`, useful for executing resource intensive tasks at low traffic times, e.g: ```csharp var jobRef = jobs.EnqueueCommand(new Aggregate { Month = DateTime.UtcNow }, new() { RunAfter = DateTime.UtcNow.Date.AddDays(1) }); ``` ### Attach Metadata to Jobs All above Background Job Options have an effect on when and how Jobs are executed. There are also a number of properties that can be attached to a Job that can be useful in background job processing despite not having any effect on how jobs are executed. These properties can be accessed by commands or APIs executing the Job and are visible and can be filtered in the Jobs Admin UI to help find and analyze executed jobs. ```csharp var jobRef = jobs.EnqueueCommand(openAiRequest, new() { // Group related jobs under a common tag Tag = "ai", // A User-specified or system generated unique Id to track the job RefId = request.RefId, // Capture who created the job CreatedBy = Request.GetClaimsPrincipal().GetUserName(), // Link jobs together that are sent together in a batch BatchId = batchId, // Capture where to notify the completion of the job to ReplyTo = "https:example.org/callback", // Additional properties about the job that aren't in the Request Args = new() { ["Additional"] = "Metadata" } }); ``` ### Querying a Job A job can be queried by either it's auto-incrementing `Id` Primary Key or by a unique `RefId` that can be user-specified. 
```csharp var jobResult = jobs.GetJob(jobRef.Id); var jobResult = jobs.GetJobByRefId(jobRef.RefId); ``` At a minimum a `JobResult` will contain the Summary Information about a Job as well as the full information about a job depending on where it's located: ```csharp class JobResult { // Summary Metadata about a Job in the JobSummary Table JobSummary Summary // Job that's still in the BackgroundJob Queue BackgroundJob? Queued // Full Job information in Monthly DB CompletedJob Table CompletedJob? Completed // Full Job information in Monthly DB FailedJob Table FailedJob? Failed // Helper to access full Job Information BackgroundJobBase? Job => Queued ?? Completed ?? Failed } ``` ### Job Execution Limits Default Retry and Timeout Limits can be configured on the `DatabaseJobFeature`: ```csharp services.AddPlugin(new DatabaseJobFeature { DefaultRetryLimit = 2, DefaultTimeout = TimeSpan.FromMinutes(10), }); ``` These limits are also overridable on a per-job basis, e.g: ```csharp var jobRef = jobs.EnqueueCommand(new Aggregate { Month = DateTime.UtcNow }, new() { RetryLimit = 3, Timeout = TimeSpan.FromMinutes(30), }); ``` ### Logging, Cancellation an Status Updates We'll use the command for checking multiple URLs to demonstrate some recommended patterns and how to enlist different job processing features. ```csharp public class CheckUrlsCommand( ILogger logger, IBackgroundJobs jobs, IHttpClientFactory clientFactory) : AsyncCommand { protected override async Task RunAsync(CheckUrls req, CancellationToken ct) { // 1. Create Logger that Logs and maintains logging in Jobs DB var log = Request.CreateJobLogger(jobs,logger); // 2. Get Current Executing Job var job = Request.GetBackgroundJob(); var result = new CheckUrlsResult { Statuses = new() }; using var client = clientFactory.CreateClient(); for (var i = 0; i < req.Urls.Count; i++) { // 3. Stop processing Job if it's been cancelled ct.ThrowIfCancellationRequested(); var url = req.Urls[i]; try { var msg = new HttpRequestMessage(HttpMethod.Get,url); var response = await client.SendAsync(msg, ct); result.Statuses[url] = response.IsSuccessStatusCode; log.LogInformation("{Url} is {Status}", url, response.IsSuccessStatusCode ? "up" : "down"); // 4. Optional: Maintain explicit progress and status updates log.UpdateStatus(i/(double)req.Urls.Count,$"Checked {i} URLs"); } catch (Exception e) { log.LogError(e, "Error checking {Url}", url); result.Statuses[url] = false; } } // 5. Send Results to WebHook Callback if specified if (job.ReplyTo != null) { jobs.EnqueueCommand(result, new() { ParentId = job.Id, ReplyTo = job.ReplyTo, }); } } } ``` We'll cover some of the notable parts useful when executing Jobs: #### 1. Job Logger We can use a Job logger to enable database logging that can be monitored in real-time in the Admin Jobs UI. Creating it with both `BackgroundJobs` and `ILogger` will return a combined logger that both Logs to standard output and to the Jobs database: ```csharp var log = Request.CreateJobLogger(jobs,logger); ``` Or just use `Request.CreateJobLogger(jobs)` to only save logs to the database. #### 2. Resolve Executing Job If needed the currently executing job can be accessed with: ```csharp var job = Request.GetBackgroundJob(); ``` Where you'll be able to access all the metadata the jobs were created with including `ReplyTo` and `Args`. #### 3. 
Check if Job has been cancelled To be able to cancel a long running job you'll need to periodically check if a Cancellation has been requested and throw a `TaskCanceledException` if it has to short-circuit the command which can be done with: ```csharp ct.ThrowIfCancellationRequested(); ``` You'll typically want to call this at the start of any loops to prevent it from doing any more work. #### 4. Optionally record progress and status updates By default Background Jobs looks at the last API or Command run and worker used to estimate the duration and progress for how long a running job will take. If preferred your command can explicitly set a more precise progress and optional status update that should be used instead, e.g: ```csharp log.UpdateStatus(progress:i/(double)req.Urls.Count, $"Checked {i} URLs"); ``` Although generally the estimated duration and live logs provide a good indication for the progress of a job. #### 5. Notify completion of Job Calling a Web Hook is a good way to notify externally initiated job requests of the completion of a job. You could invoke the callback within the command itself but there are a few benefits to initiating another job to handle the callback: - Frees up the named worker immediately to process the next task - Callbacks are durable, auto-retried and their success recorded like any job - If a callback fails the entire command doesn't need to be re-run again We can queue a callback with the result by passing through the `ReplyTo` and link it to the existing job with: ```csharp if (job.ReplyTo != null) { jobs.EnqueueCommand(result, new() { ParentId = job.Id, ReplyTo = job.ReplyTo, }); } ``` Which we can implement by calling the `SendJsonCallbackAsync` extension method with the Callback URL and the Result DTO it should be called with: ```csharp public class NotifyCheckUrlsCommand(IHttpClientFactory clientFactory) : AsyncCommand { protected override async Task RunAsync( CheckUrlsResult request, CancellationToken token) { await clientFactory.SendJsonCallbackAsync( Request.GetBackgroundJob().ReplyTo, request, token); } } ``` #### Callback URLs `ReplyTo` can be any URL which by default will have the result POST'ed back to the URL with a JSON Content-Type. Typically URLs will contain a reference Id so external clients can correlate a callback with the internal process that initiated the job. If the callback API is publicly available you'll want to use an internal Id that can't be guessed so the callback can't be spoofed, like a Guid, e.g: :::copy `https://api.example.com?refId={RefId}` ::: If needed the callback URL can be customized on how the HTTP Request callback is sent. You can change the HTTP Method used by including it before the URL: :::copy `PUT https://api.example.com` ::: If the auth part contains a colon `:` it's treated as Basic Auth: :::copy `username:password@https://api.example.com` ::: If name starts with `http.` sends a HTTP Header :::copy `http.X-API-Key:myApiKey@https://api.example.com` ::: Otherwise it's sent as a Bearer Token: :::copy `myToken123@https://api.example.com` ::: Bearer Token or HTTP Headers starting with `$` is substituted with Environment Variable if exists: :::copy `$API_TOKEN@https://api.example.com` ::: When needed headers, passwords, and tokens can be URL encoded if they contain any delimiter characters. 
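Putting those formats together, a job could be queued with a callback that authenticates with a custom HTTP Header whose value is resolved from an Environment Variable. A sketch reusing the `CheckUrlsCommand` example above, where the callback URL, `refId` and `$PARTNER_API_KEY` are illustrative:

```csharp
var jobRef = jobs.EnqueueCommand<CheckUrlsCommand>(new CheckUrls { Urls = urls }, new() {
    // Result DTO is POST'ed back with an X-Api-Key Header substituted from an Environment Variable
    ReplyTo = $"http.X-Api-Key:$PARTNER_API_KEY@https://api.example.com/callback?refId={refId}",
});
```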
## Implementing Commands At a minimum a command need only implement the [IAsyncCommand interface](https://docs.servicestack.net/commands#commands-feature): ```csharp public interface IAsyncCommand { Task ExecuteAsync(T request); } ``` Which is the singular interface that can execute any command. However commands executed via Background Jobs have additional context your commands may need to access during execution, including the `BackgroundJob` itself, the `CancellationToken` and an Authenticated User Context. To reduce the effort in creating commands with a `IRequest` context we've added a number ergonomic base classes to better capture the different call-styles a unit of logic can have including **Sync** or **Async** execution, whether they require **Input Arguments** or have **Result Outputs**. Choosing the appropriate Abstract base class benefits from IDE tooling in generating the method signature that needs to be implemented whilst Async commands with Cancellation Tokens in its method signature highlights any missing async methods that are called without the token. ### Sync Commands - `SyncCommand` - Requires No Arguments - `SyncCommand` - Requires TRequest Argument - `SyncCommandWithResult` - Requires No Args and returns Result - `SyncCommandWithResult` - Requires Arg and returns Result ```csharp public record MyArgs(int Id); public record MyResult(string Message); public class MyCommandNoArgs(ILogger log) : SyncCommand { protected override void Run() { log.LogInformation("Called with No Args"); } } public class MyCommandArgs(ILogger log) : SyncCommand { protected override void Run(MyArgs request) { log.LogInformation("Called with {Id}", request.Id); } } public class MyCommandWithResult(ILogger log) : SyncCommandWithResult { protected override MyResult Run() { log.LogInformation("Called with No Args and returns Result"); return new MyResult("Hello World"); } } public class MyCommandWithArgsAndResult(ILogger log) : SyncCommandWithResult { protected override MyResult Run(MyArgs request) { log.LogInformation("Called with {Id} and returns Result", request.Id); return new MyResult("Hello World"); } } ``` ### Async Commands - `AsyncCommand` - Requires No Arguments - `AsyncCommand` - Requires TRequest Argument - `AsyncCommandWithResult` - Requires No Args and returns Result - `AsyncCommandWithResult` - Requires Arg and returns Result ```csharp public class MyAsyncCommandNoArgs(ILogger log) : AsyncCommand { protected override async Task RunAsync(CancellationToken token) { log.LogInformation("Async called with No Args"); } } public class MyAsyncCommandArgs(ILogger log) : AsyncCommand { protected override async Task RunAsync(MyArgs request, CancellationToken t) { log.LogInformation("Async called with {Id}", request.Id); } } public class MyAsyncCommandWithResult(ILogger log) : AsyncCommandWithResult { protected override async Task RunAsync(CancellationToken token) { log.LogInformation("Async called with No Args and returns Result"); return new MyResult("Hello World"); } } public class MyAsyncCommandWithArgsAndResult(ILogger log) : AsyncCommandWithResult { protected override async Task RunAsync( MyArgs request, CancellationToken token) { log.LogInformation("Called with {Id} and returns Result", request.Id); return new MyResult("Hello World"); } } ``` ## RDBMS Request Logging and Analytics This release also restores parity to **PostgreSQL**, **SQL Server** & **MySQL** RDBMS's for our previous SQLite-only features with the new `DbRequestLogger` which is a drop-in replacement for [SQLite Request 
Logging](https://docs.servicestack.net/sqlite-request-logs) for persisting API Request Logs to a RDBMS. Whilst maintaining an archive of API Requests is nice, the real value of DB Request Logging is that it unlocks the comprehensive API Analytics and querying Logging available that was previously limited to SQLite Request Logs. :::youtube kjLcm1llC5Y In Depth and Interactive API Analytics available to all ASP .NET Core ServiceStack Apps! ::: ### Benefits of API Analytics They provide deep and invaluable insight into your System API Usage, device distribution, its Users, API Keys and the IPs where most traffic generates: - **Visibility:** Provides a clear, visual summary of complex log data, making it easier to understand API usage and performance at a glance. - **Performance Monitoring:** Helps track key metrics like request volume and response times to ensure APIs are meeting performance expectations. - **User Understanding:** Offers insights into how users (and bots) are interacting with the APIs (devices, browsers). - **Troubleshooting:** Aids in quickly identifying trends, anomalies, or specific endpoints related to issues. - **Resource Planning:** Understanding usage patterns helps in scaling infrastructure appropriately. - **Security Insight:** Identifying bot traffic and unusual request patterns can be an early indicator of security concerns. ### Interactive Analytics Analytics are also interactive where you're able to drill down to monitor the activity of individual APIs, Users, API Keys and IPs which have further links back to the request logs which the summary analytics are derived from. As they offer significant and valuable insights the `SqliteRequestLogger` is built into all ASP.NET Core IdentityAuth templates, to switch it over to use a RDBMS we recommend installing `db-identity` mix gist to also replace SQLite BackgroundJobs with the RDBMS `DatabaseJobFeature`: :::sh x mix db-identity ::: Or if you just want to replace SQLite Request Logs with a RDBMS use: :::sh x mix db-requestlogs ::: Or you can copy the [Modular Startup](https://docs.servicestack.net/modular-startup) script below: ```csharp [assembly: HostingStartup(typeof(MyApp.ConfigureRequestLogs))] namespace MyApp; public class ConfigureRequestLogs : IHostingStartup { public void Configure(IWebHostBuilder builder) => builder .ConfigureServices((context, services) => { services.AddPlugin(new RequestLogsFeature { RequestLogger = new DbRequestLogger { // NamedConnection = "" }, EnableResponseTracking = true, EnableRequestBodyTracking = true, EnableErrorTracking = true }); services.AddHostedService(); if (context.HostingEnvironment.IsDevelopment()) { services.AddPlugin(new ProfilingFeature()); } }); } public class RequestLogsHostedService(ILogger log, IRequestLogger requestLogger) : BackgroundService { protected override async Task ExecuteAsync(CancellationToken stoppingToken) { using var timer = new PeriodicTimer(TimeSpan.FromSeconds(3)); if (requestLogger is IRequireAnalytics logger) { while (!stoppingToken.IsCancellationRequested && await timer.WaitForNextTickAsync(stoppingToken)) { await logger.TickAsync(log, stoppingToken); } } } } ``` ### RDBMS Provider When using a remote RDBMS, network latency becomes a primary concern that any solution needs to be designed around, as such the API Request Logs are initially maintained in an in memory collection before being flushed to the database **every 3 seconds** — configurable in the `PeriodicTimer` interval above. 
To reduce the number of round-trips to the database, the `DbRequestLogger` batches all pending logs into a single request using [OrmLite's Bulk Inserts](https://docs.servicestack.net/ormlite/bulk-inserts) which is supported by all major RDBMS's. ### PostgreSQL Table Partitioning PostgreSQL provides native support for table partitioning, allowing us to automatically create monthly partitions using `PARTITION BY RANGE` on the `CreatedDate` column. The `DbRequestLogger` automatically creates new monthly partitions as needed, maintaining the same logical separation as SQLite's monthly .db's while keeping everything within a single Postgres DB: ```sql CREATE TABLE "RequestLog" ( -- columns... "CreatedDate" TIMESTAMP NOT NULL, PRIMARY KEY ("Id","CreatedDate") ) PARTITION BY RANGE ("CreatedDate"); -- Monthly partitions are automatically created, e.g.: CREATE TABLE "RequestLog_2025_01" PARTITION OF "RequestLog" FOR VALUES FROM ('2025-01-01') TO ('2025-02-01'); ``` ### SQLServer / MySQL - Manual Partition Management For **SQL Server** and **MySQL**, monthly partitioned tables need to be created **out-of-band** (either manually or via cronjob scripts) since they don't support the same level of automatic partition management as PostgreSQL. However, this still works well in practice as because `RequestLog` is an **Append Only** table with all querying from the Admin UIs being filtered by its indexed `CreatedDate` in monthly viewable snapshots like it was with SQLite. ### Separate RequestLog Database Or if preferred, you can maintain request logs in a **separate database** from your main application database. This separation keeps the write-heavy logging load off your primary database, allowing you to optimize each database independently for its specific workload patterns like maintaining different backup strategies for your critical application data vs. log history. ```csharp // Configure.Db.cs services.AddOrmLite(options => options.UsePostgres(connectionString)) .AddPostgres("logs", logsConnectionString); // Configure.RequestLogs.cs services.AddPlugin(new RequestLogsFeature { RequestLogger = new DbRequestLogger { NamedConnection = "logs" }, //... }); ``` ## Queryable Admin Logging UI This will enable a more feature rich Request Logging Admin UI which utilizes the full queryability of the [AutoQueryGrid](https://docs.servicestack.net/vue/autoquerygrid) component to filter, sort and export Request Logs. [![](/img/pages/admin-ui/sqlitelogs.webp)](/img/pages/admin-ui/sqlitelogs.webp) ## Analytics Overview Utilizing an `DbRequestLogger` also enables the **Analytics** Admin UI in the sidebar which initially displays the API Analytics Dashboard: :::{.wideshot} [![](/img/pages/admin-ui/analytics-apis1.webp)](/img/pages/admin-ui/analytics-apis1.webp) ::: ### Distribution Pie Charts Lets you quickly understand the composition of your user base and traffic sources and the distribution of users across different web browsers, device types, and to identify the proportion of traffic coming from automated bots. ### Requests per day Line Chart Lets you monitor API usage trends and performance over time. It tracks the total number of API requests and the average response time day-by-day. You can easily spot trends like peak usage hours/days, identify sudden spikes or drops in traffic, and correlate request volume with API performance which is crucial for capacity planning and performance troubleshooting. 
### API tag groups Pie Chart Lets you understand the usage patterns across different functional categories of your APIs. By grouping API requests based on assigned tags (like Security, Authentication, User Management, Tech, etc.), you get a high-level view of which *types* of functionalities are most frequently used or are generating the most load. ### API Requests Bar Chart Lets you identify the most and least frequently used specific API endpoints which ranks individual API endpoints by the number of requests they receive. This helps pinpoint: - **Critical Endpoints:** The most heavily used APIs that require robust performance and monitoring. - **Optimization Targets:** High-traffic endpoints that could benefit from performance optimization. - **Underutilized Endpoints:** APIs that might be candidates for deprecation or require promotion. - **Troubleshooting:** If performance issues arise (seen in the line chart), this helps narrow down which specific endpoint might be responsible. :::{.wideshot} [![](/img/pages/admin-ui/analytics-apis2.webp)](/img/pages/admin-ui/analytics-apis2.webp) ::: ### Total Duration Bar Chart Identifies which API endpoints consume the most *cumulative processing time* over the selected period. Even if an API endpoint is relatively fast per call, if it's called extremely frequently, it can contribute significantly to overall server load. Optimizing these can lead to significant savings in server resources (CPU, memory). ### Average Duration Bar Chart Pinpoints which API endpoints are the slowest on a *per-request* basis. APIs at the top of this list are prime candidates for performance investigation and optimization, as they represent potential user-facing slowness or system bottlenecks. ### Requests by Duration Ranges Histogram Provides an overview of the performance distribution for *all* API requests. This chart shows how many requests fall into different speed buckets and helps you understand the overall responsiveness of your API system at a glance. ## Individual API Analytics Clicking on an API's bar chart displays a dedicated, detailed view of a single API endpoint's behavior, isolating its performance and usage patterns from the overall system metrics offering immediate insight into the endpoint's traffic volume and reliability. :::{.wideshot} [![](/img/pages/admin-ui/analytics-api.webp)](/img/pages/admin-ui/analytics-api.webp) ::: ### Total Requests Displays the total requests for an API during the selected month. It includes HTTP Status Breakdown which provide **direct access to the filtered request logs**. This is a major benefit for **rapid troubleshooting**, allowing you to instantly view the specific log entries corresponding to successful requests or particular error codes for this API. ### Last Request Information Provides immediate context on the most recent activity for this endpoint with *when* the last request occurred, the source **IP address** and device information to help understand recent usage and check if the endpoint is still active, or quickly investigate the very last interaction if needed. ### Duration Summary Table (Total, Min, Max) Quantifies the performance characteristics specifically for this endpoint with the cumulative (Total) processing load, the best-case performance (Min), and the worst-case performance (Max) which is useful for identifying performance outliers. ### Duration Requests Histogram Visualizes the performance distribution for this API. 
### Top Users Bar Chart Identifies which authenticated users are most frequently calling this API and relies on this endpoint the most. This can be useful for identifying power users, potential API abuse by a specific user account, or understanding the impact of changes to this API on key users. ### Top IP Addresses Bar Chart Shows which source IP addresses are generating the most traffic for this API. Useful for identifying high-volume clients, specific servers interacting with this endpoint, or potentially malicious IPs. ## Users The **Users** tab will display the top 100 Users who make the most API Requests and lets you click on a Users bar chart to view their individual User analytics. :::{.wideshot} [![](/img/pages/admin-ui/analytics-users.webp)](/img/pages/admin-ui/analytics-users.webp) ::: ### Individual User Analytics Provides a comprehensive view of a single user's complete interaction history and behavior across all APIs they've accessed, shifting the focus from API performance to user experience and activity. :::{.wideshot} [![](/img/pages/admin-ui/analytics-user.webp)](/img/pages/admin-ui/analytics-user.webp) ::: ### User Info & Total Requests Identifies the user and quantifies their overall activity level. Clicking on their ID or Name will navigate to the Users Admin UI. It also shows their success/error rate via the clickable status code links. This helps gauge user engagement and baseline activity. ### Last Request Information Offers a snapshot of the user's most recent interaction for immediate context. Knowing **when**, **what** API they called, from which **IP address**, using which **client** & **device** is valuable for support, identifying their last action or checking recent activity. ### HTTP Status Pie Chart Visualizes the overall success and error rate specifically for this user's API requests. ### Performance & Request Body Summary Table Quantifies the performance experienced by this user and the data they typically send. ### Duration Requests Histogram Shows the distribution of response times for requests made by this user to help understand the typical performance this user experiences. ### Top APIs Bar Chart Reveals which API endpoints this user interacts with most frequently and help understanding user behavior and which features they use most. ### Top IP Addresses Bar Chart Identifies the primary network locations or devices the user connects from. ### User Admin UI Analytics To assist in discoverability a snapshot of a Users Analytics is also visible in the Users Admin UI: [![](/img/pages/admin-ui/analytics-user-adminui.webp)](/img/pages/admin-ui/analytics-user-adminui.webp) Clicking on **View User Analytics** takes you to the Users Analytics page to access to the full Analytics features and navigation. ## API Keys The **API Keys** tab will display the top 100 API Keys who make the most API Requests and lets you click on an API Key bar chart to view its individual API Key analytics. :::{.wideshot} [![](/img/pages/admin-ui/analytics-apikeys.webp)](/img/pages/admin-ui/analytics-apikeys.webp) ::: ### Individual API Key Analytics Provides comprehensive API Key analytics Similar to User Analytics but limited to the API Usage of a single API Key: :::{.wideshot} [![](/img/pages/admin-ui/analytics-apikey.webp)](/img/pages/admin-ui/analytics-apikey.webp) ::: ## IPs The **IP Addresses** tab will display the top 100 IPs that make the most API Requests. Click on an IP's bar chart to view its individual analytics made from that IP Address. 
:::{.wideshot} [![](/img/pages/admin-ui/analytics-ips.webp)](/img/pages/admin-ui/analytics-ips.webp) ::: ### Individual IP Analytics Provides comprehensive IP Address analytics Similar to User Analytics but limited to the API Usage from a single IP Address: :::{.wideshot} [![](/img/pages/admin-ui/analytics-ip.webp)](/img/pages/admin-ui/analytics-ip.webp) ::: ## Protect same APIs with API Keys or Identity Auth Modern APIs need to serve different types of clients, each with distinct authentication requirements. Understanding when to use **Identity Auth** versus **API Keys** is crucial to optimize for security, performance, and user experience. ### Two Auth Paradigms for Different Use Cases ### Identity Auth: User → API **Identity Auth** is designed for scenarios where a **human user** is interacting with your API, typically through a web or mobile application which: - Requires user credentials (username/password, OAuth, etc.) - Establishes a user session with roles and permissions - For interactive workflows like logins, password resets & email confirmation - Enables user-specific features like profile management and personalized UX - Provides full access to user context, claims, and role-based authorization ### API Keys: Machine → API / User Agent → API **API Keys** are purpose-built for **machine-to-machine** communication or **user agents** accessing your API programmatically, without interactive user authentication. This authentication model: - Provides simple, token-based authentication without user sessions - Enables fine-grained access control through scopes and features - Supports non-interactive scenarios like scripts, services, and integrations - Can optionally be associated with a user but doesn't run in their context - Offers superior performance by avoiding the auth workflow overhead - Supports project based billing and usage metrics by API Key **Common scenarios:** - Microservices communicating with each other - Third-party integrations accessing your API - CLI tools and scripts that need API access - Mobile apps or SPAs making direct API calls without user context - Webhooks and automated processes - Providing API access to partners or customers with controlled permissions Despite serving 2 different use-cases there are a few times when you may want to serve the same API with both Identity Auth and API Keys. ### Supporting both Auth Models with 2 APIs Previously you would've needed to maintain two separate APIs, one protected with Identity Auth and another with API Keys. Thanks to ServiceStack's message-based APIs and [built-in Auto Mapping](https://docs.servicestack.net/auto-mapping) this is fairly easy to do: ```csharp // For authenticated users [ValidateIsAuthenticated] public class QueryOrders : QueryDb { } // For API key access [ValidateApiKey] public class QueryOrdersApiKey : QueryDb { } public class OrderService : Service { public List Get(GetOrders request) { var userId = Request.GetRequiredUserId(); // Shared business logic } public List Get(GetOrdersViaApiKey request) => Get(request.ConvertTo()); } public static class MyExtensions { public static string GetRequiredUserId(this IRequest? req) => req.GetApiKey()?.UserAuthId ?? req.GetClaimsPrincipal().GetUserId() ?? throw HttpError.Unauthorized("API Key must be associated with a user"); } ``` Whilst easy to implement, the biggest draw back with this approach is that it requires maintaining 2x APIs, 2x API endpoints, and 2x API docs. 
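The duplication is just as visible from the client's side, where interactive users and integrations have to target different Request DTOs. A sketch using the [C# Service Client](https://docs.servicestack.net/csharp-client), assuming the DTOs above and illustrative `baseUrl`/`apiKey` values:

```csharp
// Interactive users authenticated with Identity Auth (Cookies after sign-in)
var userClient = new JsonApiClient(baseUrl);
var userOrders = await userClient.ApiAsync(new QueryOrders());

// Machine clients authenticated with an API Key
var apiClient = new JsonApiClient(baseUrl) { BearerToken = apiKey };
var apiOrders = await apiClient.ApiAsync(new QueryOrdersApiKey());
```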
## The Best of Both Worlds

ServiceStack's flexible [API Keys feature](https://docs.servicestack.net/auth/apikeys) now allows you to protect the same APIs with **both** Identity Auth and API Keys, enabling you to:

- Maintain a single API surface for all clients
- Serve the same interactive UIs protected with Identity Auth or API Keys
- Provide programmatic access via API Keys
- Maintain all the benefits of API Keys

To achieve this, users will need to have a valid API Key generated for them which would then need to be added to the `apikey` Claim in the `UserClaimsPrincipalFactory` to be included in their Identity Auth Cookie:

```csharp
// Program.cs
services.AddScoped<IUserClaimsPrincipalFactory<ApplicationUser>, AdditionalUserClaimsPrincipalFactory>();

// Add additional claims to the Identity Auth Cookie
public class AdditionalUserClaimsPrincipalFactory(
    UserManager<ApplicationUser> userManager,
    RoleManager<IdentityRole> roleManager,
    IApiKeySource apiKeySource,
    IOptions<IdentityOptions> optionsAccessor)
    : UserClaimsPrincipalFactory<ApplicationUser, IdentityRole>(userManager, roleManager, optionsAccessor)
{
    public override async Task<ClaimsPrincipal> CreateAsync(ApplicationUser user)
    {
        var principal = await base.CreateAsync(user);
        var identity = (ClaimsIdentity)principal.Identity!;

        var claims = new List<Claim>();
        if (user.ProfileUrl != null)
        {
            claims.Add(new Claim(JwtClaimTypes.Picture, user.ProfileUrl));
        }

        // Add Users latest valid API Key to their Auth Cookie's 'apikey' claim
        var latestKey = (await apiKeySource.GetApiKeysByUserIdAsync(user.Id))
            .OrderByDescending(x => x.CreatedDate)
            .FirstOrDefault();
        if (latestKey != null)
        {
            claims.Add(new Claim(JwtClaimTypes.ApiKey, latestKey.Key));
        }

        identity.AddClaims(claims);
        return principal;
    }
}
```

After which Authenticated Users will be able to access `[ValidateApiKey]` protected APIs where it attaches the API Key in the `apikey` Claim to the request, resulting in the same behavior as if they had sent their API Key with the request.

```csharp
// For authenticated users or API Keys
[ValidateApiKey]
public class QueryOrders : QueryDb<Order> { }
```

## AI Chat

We're excited to introduce **AI Chat** — a refreshingly simple solution for integrating AI into your applications by unlocking the full value of the OpenAI Chat API.

Unlike most other OpenAI SDKs and Frameworks, all of AI Chat's features are centered around arguably the most important API in our time - OpenAI's simple [Chat Completion API](https://platform.openai.com/docs/api-reference/chat) i.e. the primary API used to access Large Language Models (LLMs).

We've had several attempts at adding a valuable layer of functionality for harnessing AI into our Apps, including:

- [GptAgentFeature](https://servicestack.net/posts/chat-gpt-agents) - Use Semantic Kernel to implement Autonomous agents with Chain-of-Thought reasoning
- [TypeScript TypeChat](https://servicestack.net/posts/typescript-typechat-examples) - Use Semantic Kernel to implement all of TypeScript's TypeChat examples in .NET
- [ServiceStack.AI](https://servicestack.net/posts/servicestack-ai) - TypeChat providers and unified Abstractions over AWS, Azure and Google AI Providers

The problem being that we wouldn't consider any of these solutions to be relevant today, as any "smarts" or opinionated logic added looks to become irrelevant as AI models get more capable and intelligent.

## The Problem with Complex Abstractions

Over the years, we've seen AI integration libraries grow in complexity.
Take [Microsoft Semantic Kernel](https://github.com/microsoft/semantic-kernel) - a sprawling codebase that maintains its own opinionated abstractions that aren't serializable and has endured several breaking changes over the years. After investing development effort in catching up with their breaking changes we're now told to [Migrate to Agent Framework](https://learn.microsoft.com/en-us/agent-framework/migration-guide/from-semantic-kernel/). The fundamental issue? These complex abstractions didn't prove to be reusable. Microsoft's own next competing solution [Agent Framework](https://github.com/microsoft/agent-framework) - doesn't even use Semantic Kernel Abstractions. Instead, it maintains its own non-serializable complex abstractions, repeating the same architectural issues. This pattern of building heavyweight, non-portable abstractions creates vendor lock-in, adds friction, hinders reuse, and limits how and where it can be used. After getting very little value from Semantic Kernel, we don't plan for any rewrites to follow adoption of their next over-engineered framework. ## Back to OpenAI Chat The only AI Abstraction we feel confident that has any longevity in this space, that wont be subject to breaking changes and rewrites is the underlying OpenAI Chat Completion API itself. The API with the most utility, with all the hard work of having AI Providers adopt this common API already done for us, we just have to facilitate calling it. Something so simple that it can be easily called from a shell script: ```bash RESPONSE=$(curl https://api.openai.com/v1/chat/completions \ -H "Authorization: Bearer $OPENAI_API_KEY" \ -H "Content-Type: application/json" \ -d '{ "model": "gpt-5", "messages": [{"role": "user", "content": "Capital of France?"}] }') echo "$RESPONSE" | jq -r '.choices[0].message.content' ``` Shouldn't require complex libraries over several NuGet packages to make use of. The simplest and obvious solution is design around the core `ChatCompletion` DTO itself - a simple, serializable, implementation-free data structure that maps directly to the OpenAI Chat API request body maintained in [ChatCompletion.cs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.AI.Chat/ChatCompletion.cs) with all its functionality encapsulated (no third-party dependencies) within the new **ServiceStack.AI.Chat** NuGet package. Using DTOs gives us all the natural [advantages of message-based APIs](https://docs.servicestack.net/advantages-of-message-based-web-services) whose clean POCO models helps us [fight against complexity](https://docs.servicestack.net/service-complexity-and-dto-roles). ### Why This Matters Because `ChatCompletion` is a plain serializable DTO, you can: - **Store it in a database** - Save conversation history, audit AI requests, or implement retry logic - **Use it in client workflows** - Pass the same DTO between frontend and backend without transformations - **Send it through message queues** - Build asynchronous AI processing pipelines with RabbitMQ and others - **Debug easily** - Inspect the exact JSON being sent to OpenAI - **Test easily** - Mock AI responses with simple DTOs or JSON payloads - **Use it outside the library** - The DTO works independently of any specific client implementation More importantly, because it's a **Request DTO**, we unlock a wealth of ServiceStack features for free, since most of ServiceStack's functionality is designed around Request DTOs — which we'll explore later. 
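Because it's just a POCO it also round-trips through ServiceStack.Text like any other DTO, so you can persist, queue or inspect a request before it's ever sent to a provider. A small sketch of what that enables, using the `Message` helpers covered below:

```csharp
var request = new ChatCompletion {
    Model = "gpt-5",
    Messages = [
        Message.SystemPrompt("You are a helpful assistant"),
        Message.Text("Capital of France?"),
    ]
};

// Inspect or store the JSON payload, e.g. in a DB column or MQ message
var json = request.ToJson();

// Rehydrate it later to replay or audit the request
var replay = json.FromJson<ChatCompletion>();
```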
## Install

AI Chat can be added to any .NET 8+ project by installing the **ServiceStack.AI.Chat** NuGet package and configuring it with:

:::sh
x mix chat
:::

Which drops this simple [Modular Startup](https://docs.servicestack.net/modular-startup) that adds the `ChatFeature` and registers a link to its UI on the [Metadata Page](https://docs.servicestack.net/metadata-page) if you want it:

```csharp
public class ConfigureAiChat : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            services.AddPlugin(new ChatFeature());

            services.ConfigurePlugin<MetadataFeature>(feature => {
                feature.AddPluginLink("/chat", "AI Chat");
            });
        });
}
```

#### Prerequisites:

As AI Chat protects its APIs and UI with Identity Auth or API Keys, you'll need to enable the [API Keys Feature](https://docs.servicestack.net/auth/apikeys) if you haven't already:

:::sh
x mix apikeys
:::

## Simple, Not Simplistic

How simple is it to use? It's just as you'd expect: your App logic need only bind to a simple `IChatClient` interface that accepts a Typed `ChatCompletion` Request DTO and returns a Typed `ChatResponse` DTO:

```csharp
public interface IChatClient
{
    Task<ChatResponse> ChatAsync(
        ChatCompletion request, CancellationToken token=default);
}
```

An implementation-free, easily substitutable interface for calling any OpenAI-compatible Chat API, using clean Typed `ChatCompletion` and `ChatResponse` DTOs.

Unfortunately, since the API needs to be typed and .NET serializers don't yet support de/serializing union types, the DTO adopts OpenAI's more verbose and flexible multi-part Content Type which looks like:

```csharp
IChatClient client = CreateClient();

var request = new ChatCompletion {
    Model = "gpt-5",
    Messages = [
        new() {
            Role = "user",
            Content = [
                new AiTextContent {
                    Type = "text",
                    Text = "Capital of France?"
                }
            ],
        }
    ]
};
var response = await client.ChatAsync(request);
```

To improve the UX we've added a [Message.cs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.AI.Chat/Message.cs) helper which encapsulates the boilerplate of sending **Text**, **Image**, **Audio** and **Files** into more succinct and readable code, where you'd typically only need to write:

```csharp
var request = new ChatCompletion {
    Model = "gpt-5",
    Messages = [
        Message.SystemPrompt("You are a helpful assistant"),
        Message.Text("Capital of France?"),
    ]
};
var response = await client.ChatAsync(request);

string? answer = response.GetAnswer();
```

### Same ChatCompletion DTO, Used Everywhere

That's all that's required for your internal App Logic to access your App's configured AI Models.
However, as AI Chat also makes its own OpenAI Compatible API available, your external .NET Clients can use the **same exact DTO** to get the **same Response** by calling your API with a [C# Service Client](https://docs.servicestack.net/csharp-client):

```csharp
var client = new JsonApiClient(BaseUrl) {
    BearerToken = apiKey
};
var response = await client.SendAsync(request);
```

### Support for Text, Images, Audio & Files

For Multi-modal LLMs that support it, you can also send Images, Audio & File attachments with your AI Request using **URLs**, e.g:

```csharp
var image = new ChatCompletion {
    Model = "qwen2.5vl",
    Messages = [
        Message.Image(imageUrl:"https://example.org/image.webp",
            text:"Describe the key features of the input image"),
    ]
};

var audio = new ChatCompletion {
    Model = "gpt-4o-audio-preview",
    Messages = [
        Message.Audio(data:"https://example.org/speaker.mp3",
            text:"Please transcribe and summarize this audio file"),
    ]
};

var file = new ChatCompletion {
    Model = "gemini-flash-latest",
    Messages = [
        Message.File(fileData:"https://example.org/order.pdf",
            text:"Please summarize this document"),
    ]
};
```

#### Relative File Path

If a [VirtualFiles Provider](https://docs.servicestack.net/virtual-file-system) was configured, you can specify a relative path instead:

```csharp
var image = new ChatCompletion {
    Model = "qwen2.5vl",
    Messages = [
        Message.Image(imageUrl:"/path/to/image.webp",
            text:"Describe the key features of the input image"),
    ]
};
```

#### Manual Download & Embedding

Alternatively you can embed and send the raw Base64 Data or Data URI yourself:

```csharp
var bytes = await "https://example.org/image.webp".GetBytesFromUrlAsync();
var dataUri = $"data:image/webp;base64,{Convert.ToBase64String(bytes)}";

var image = new ChatCompletion {
    Model = "qwen2.5vl",
    Messages = [
        Message.Image(imageUrl:dataUri,
            text:"Describe the key features of the input image"),
    ]
};
```

Sending references to external resources instead keeps AI Request payloads small, making them easier to store in Databases, send over MQs, pass through client workflows, etc. This also illustrates one of the "value-added" features of AI Chat, where it will automatically download any URL Resources and embed them as Base64 Data in the `ChatCompletion` Request DTO.

### Configure Downloads

Relative paths can be enabled by configuring a `VirtualFiles` Provider that refers to a safe path you want to allow access to. URLs are downloaded by default, but this behavior can be customized with `ValidateUrl` or replaced entirely with `DownloadUrlAsBase64Async`:

```csharp
services.AddPlugin(new ChatFeature {
    // Enable Relative Path Downloads
    VirtualFiles = new FileSystemVirtualFiles(assetDir),

    // Validate URLs before download
    ValidateUrl = url => {
        if (!IsAllowedUrl(url))
            throw HttpError.Forbidden("URL not allowed");
    },

    // Use Custom URL Downloader
    // DownloadUrlAsBase64Async = async (provider, url) => {
    //     var (base64, mimeType) = await MyDownloadAsync(url);
    //     return (base64, mimeType);
    // },
});
```

## Configure AI Providers

By default AI Chat is configured with a list of providers in its `llms.json` which is pre-configured with the best models from the leading LLM providers.
The easiest way to use a custom `llms.json` is to add a local modified copy of [llms.json](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.AI.Chat/chat/llms.json) to your App's `/wwwroot/chat` folder:

```files
/wwwroot
  /chat
    llms.json
```

If you just need to change which providers are enabled you can specify them in `EnableProviders`:

```csharp
services.AddPlugin(new ChatFeature {
    // Specify which providers you want to enable
    EnableProviders = [
        "openrouter_free",
        "groq",
        "google_free",
        "codestral",
        "ollama",
        "openrouter",
        "google",
        "anthropic",
        "openai",
        "grok",
        "qwen",
        "z.ai",
        "mistral",
    ],

    // Use custom llms.json configuration
    ConfigJson = vfs.GetFile("App_Data/llms.json").ReadAllText(),
});
```

Alternatively you can use `ConfigJson` to load a custom JSON provider configuration from a different source, which you'll want to use if you prefer to keep your provider configuration and API Keys all in `llms.json`.

### llms.json - OpenAI Provider Configuration

[llms.json](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.AI.Chat/chat/llms.json) contains the list of OpenAI Compatible Providers you want to make available, along with the user-defined **model aliases** used for model routing and the provider-specific model name each alias maps to when used with that provider, e.g:

```json
{
  "providers": {
    "openrouter": {
      "enabled": false,
      "type": "OpenAiProvider",
      "base_url": "https://openrouter.ai/api",
      "api_key": "$OPENROUTER_API_KEY",
      "models": {
        "grok-4": "x-ai/grok-4",
        "glm-4.5-air": "z-ai/glm-4.5-air",
        "kimi-k2": "moonshotai/kimi-k2",
        "deepseek-v3.1:671b": "deepseek/deepseek-chat",
        "llama4:400b": "meta-llama/llama-4-maverick"
      }
    },
    "anthropic": {
      "enabled": false,
      "type": "OpenAiProvider",
      "base_url": "https://api.anthropic.com",
      "api_key": "$ANTHROPIC_API_KEY",
      "models": {
        "claude-sonnet-4-0": "claude-sonnet-4-0"
      }
    },
    "ollama": {
      "enabled": false,
      "type": "OllamaProvider",
      "base_url": "http://localhost:11434",
      "models": {},
      "all_models": true
    },
    "google": {
      "enabled": false,
      "type": "GoogleProvider",
      "api_key": "$GOOGLE_API_KEY",
      "models": {
        "gemini-flash-latest": "gemini-flash-latest",
        "gemini-flash-lite-latest": "gemini-flash-lite-latest",
        "gemini-2.5-pro": "gemini-2.5-pro",
        "gemini-2.5-flash": "gemini-2.5-flash",
        "gemini-2.5-flash-lite": "gemini-2.5-flash-lite"
      },
      "safety_settings": [
        {
          "category": "HARM_CATEGORY_DANGEROUS_CONTENT",
          "threshold": "BLOCK_ONLY_HIGH"
        }
      ],
      "thinking_config": {
        "thinkingBudget": 1024,
        "includeThoughts": true
      }
    },
    //...
  }
}
```

The only non-OpenAI Chat Provider AI Chat supports is `GoogleProvider`, where an exception was made to add explicit support for Gemini's Models given their low cost and generous free quotas.

### Provider API Keys

API Keys can either be specified within `llms.json` itself, or alternatively API Keys starting with `$` like `$GOOGLE_API_KEY` will first be resolved from `Variables` before falling back to checking Environment Variables.

```csharp
services.AddPlugin(new ChatFeature {
    EnableProviders = [
        "openrouter",
        "anthropic",
        "google",
    ],
    Variables = {
        ["OPENROUTER_API_KEY"] = secrets.OPENROUTER_API_KEY,
        ["ANTHROPIC_API_KEY"] = secrets.ANTHROPIC_API_KEY,
        ["GOOGLE_API_KEY"] = secrets.GOOGLE_API_KEY,
    }
});
```

### Model Routing and Failover

Providers that support the requested model are invoked in the order they're defined in `llms.json`. If a provider fails, the next available provider is tried.
This enables scenarios like:

- Routing different request types to different providers
- Optimizing by Cost, Performance, Reliability, or Privacy
- A/B testing different models
- Added resilience with fallback when a provider is unavailable

Model aliases don't need to identify a model directly, e.g. you could use your own artificial names for the use-cases you need, like `image-captioner`, `audio-transcriber` or `pdf-extractor`, then map them to the model each provider should use to achieve the desired task.

#### Use Model Routing with Fallback

To make use of model routing and fallback you would call `ChatAsync` on `IChatClient` directly:

```csharp
class MyService(IChatClient client)
{
    public async Task<object> Any(DefaultChat request)
    {
        return await client.ChatAsync(new ChatCompletion {
            Model = "glm-4.6",
            Messages = [ Message.Text(request.UserPrompt) ],
        });
    }
}
```

#### Use Specific Provider

Alternatively, to use a specific provider you can use the `IChatClients` dependency's `GetClient(providerId)` method to resolve that provider, where calling its `ChatAsync` will only use that provider:

```csharp
class MyService(IChatClients clients)
{
    public async Task<object> Any(ProviderChat request)
    {
        var groq = clients.GetClient("groq");
        return await groq.ChatAsync(new ChatCompletion {
            Model = "kimi-k2",
            Messages = [ Message.Text(request.UserPrompt) ],
        });
    }
}
```

## Persist AI Chat History

By default AI Chat is designed to be minimally invasive and doesn't require anything other than the API Keys needed to access the AI Models it should use.

If preferred, you can choose to persist the AI Chat History made through the external ChatCompletion API with the `OnChatCompletionSuccessAsync` and `OnChatCompletionFailedAsync` callbacks, which can be used to store successful and failed requests in your preferred data store using the included [ChatCompletionLog](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.AI.Chat/ChatCompletionLog.cs) data model or your own:

```csharp
public class ConfigureAiChat : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            services.AddPlugin(new ChatFeature {
                OnChatCompletionSuccessAsync = async (request, response, req) => {
                    using var db = await req.Resolve<IDbConnectionFactory>().OpenAsync();
                    await db.InsertAsync(req.ToChatCompletionLog(request, response));
                },
                OnChatCompletionFailedAsync = async (request, exception, req) => {
                    using var db = await req.Resolve<IDbConnectionFactory>().OpenAsync();
                    await db.InsertAsync(req.ToChatCompletionLog(request, exception));
                },
            });
        }).ConfigureAppHost(appHost => {
            using var db = appHost.Resolve<IDbConnectionFactory>().Open();
            db.CreateTableIfNotExists<ChatCompletionLog>();
        });
}
```

### Compatible with llms.py

The other benefit of simple configuration and simple solutions is that they're easy to implement - a perfect example being that this is the 2nd implementation built on this configuration. The same configuration, UI, APIs and functionality is also available in our [llms.py](https://github.com/ServiceStack/llms) Python CLI and server gateway, which we developed to have the dependency-free LLM Gateway solution needed for our ComfyUI Agents.

:::sh
pip install llms-py
:::

This also means you can use and test your own custom `llms.json` configuration on the command-line or in shell automation scripts:

```sh
# Simple question
llms "Explain quantum computing"

# With specific model
llms -m gemini-2.5-pro "Write a Python function to sort a list"

# With system prompt
llms -s "You are a helpful coding assistant" "Reverse a string in Python?"
# With image (vision models)
llms --image image.jpg "What's in this image?"
llms --image https://example.com/photo.png "Describe this photo"

# Display full JSON Response
llms "Explain quantum computing" --raw

# Start the UI and an OpenAI compatible API on port 8000:
llms --serve 8000
```

Incidentally, as [llms.py UI](https://servicestack.net/posts/llms-py-ui) and AI Chat utilize the same UI, you can use its **import/export** features to transfer your AI Chat History between them.

Check out the [llms.py GitHub repo](https://github.com/ServiceStack/llms) for even more features.

## AI Chat UI

Another major value proposition of [AI Chat](https://servicestack.net/posts/ai-chat) is being able to offer a ChatGPT-like UI to your users, where you're able to control the API Keys, billing, and sanctioned providers your users can access to maintain your own **Fast, Local, and Private** access to AI from within your own organization.

### Identity Auth or Valid API Key

AI Chat makes use of ServiceStack's new [API Keys or Identity Auth APIs](https://servicestack.net/posts/apikey_auth_apis) which allow access for Authenticated Identity Auth users, whilst unauthenticated users will need to provide a valid API Key:

:::{.shadow}
[![](/img/pages/ai-chat/ai-chat-ui-apikey.webp)](/img/pages/ai-chat/ai-chat-ui-apikey.webp)
:::

If needed, `ValidateRequest` can be used to further restrict access to AI Chat's UI and APIs, e.g. you can restrict access to API Keys with the `Admin` scope with:

```csharp
services.AddPlugin(new ChatFeature {
    ValidateRequest = async req =>
        req.GetApiKey()?.HasScope(RoleNames.Admin) == true
            ? null
            : HttpResult.Redirect("/admin-ui"),
});
```

### Import / Export

All data is stored locally in the user's browser IndexedDB. When needed you can back up and transfer your entire chat history between different browsers using the **Export** and **Import** features on the home page.

:::{.wideshot}
[![llms-home.webp](/img/pages/ai-chat/llms-home.webp)](/img/pages/ai-chat/llms-home.webp)
:::

## Simple and Flexible UI

Like all of [ServiceStack's built-in UIs](https://servicestack.net/auto-ui), AI Chat is also [naturally customizable](https://docs.servicestack.net/locode/custom-overview) where you can override any of [AI Chat's Vue Components](https://github.com/ServiceStack/ServiceStack/tree/main/ServiceStack/src/ServiceStack.AI.Chat/chat) with your own by placing them in your [/wwwroot/chat](https://github.com/ServiceStack/ServiceStack/tree/main/ServiceStack/tests/AdhocNew/wwwroot/chat) folder:

```files
/wwwroot
  /chat
    Brand.mjs
    Welcome.mjs
```

Where you'll be able to customize the appearance and behavior of AI Chat's UI to match your App's branding and needs.
:::{.wideshot}
[![](/img/pages/ai-chat/ai-chat-custom-ui.webp)](/img/pages/ai-chat/ai-chat-custom-ui.webp)
:::

## Customize

The built-in [ui.json](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.AI.Chat/chat/ui.json) configuration can be overridden with your own to use your preferred system prompts and other defaults by adding them to your local folder:

```files
/wwwroot
  /chat
    llms.json
    ui.json
```

Alternatively `ConfigJson` and `UiConfigJson` can be used to load custom JSON configuration from a different source, e.g:

```csharp
services.AddPlugin(new ChatFeature {
    // Use custom llms.json configuration
    ConfigJson = vfs.GetFile("App_Data/llms.json").ReadAllText(),

    // Use custom ui.json configuration
    UiConfigJson = vfs.GetFile("App_Data/ui.json").ReadAllText(),
});
```

## Rich Markdown & Syntax Highlighting

To maximize readability there's full support for Markdown and Syntax highlighting for the most popular programming languages.

:::{.wideshot}
[![llms-syntax.webp](/img/pages/ai-chat/llms-syntax.webp)](/img/pages/ai-chat/llms-syntax.webp)
:::

To quickly and easily make use of AI Responses, **Copy Code** icons are readily available on hover of all messages and code blocks.

## Rich, Multimodal Inputs

The Chat UI goes beyond just text and can take advantage of the multimodal capabilities of modern LLMs with support for Image, Audio, and File inputs.

### 🖼️ 1. Image Inputs & Analysis

Images can be uploaded directly into your conversations with vision-capable models for comprehensive image analysis. Visual AI Responses are highly dependent on the model used. This is a typical example of the visual analysis of our [ServiceStack Logo](/img/logo.png) provided by the latest Gemini Flash:

:::{.wideshot}
[![llms-image.webp](/img/pages/ai-chat/llms-image.webp)](/img/pages/ai-chat/llms-image.webp)
:::

### 🎤 2. Audio Input & Transcription

Likewise you can upload Audio files and have them transcribed and analyzed by multi-modal models with audio capabilities.

:::{.wideshot}
[![llms-audio.webp](/img/pages/ai-chat/llms-audio.webp)](/img/pages/ai-chat/llms-audio.webp)
:::

In the example of processing audio input above, audio files are uploaded with system and user prompts instructing the model to transcribe and summarize their content, with multi-modal capabilities integrated right within the chat interface.

### 📎 3. File and PDF Attachments

In addition to images and audio, you can also upload documents, PDFs, and other files to capable models to extract insights, summarize or analyze their content.

**Document Processing Use Cases:**

- **PDF Analysis**: Upload PDF documents for content extraction and analysis
- **Data Extraction**: Extract specific information from structured documents
- **Document Summarization**: Get concise summaries of lengthy documents
- **Query Content**: Ask questions about specific content in documents
- **Batch Processing**: Upload multiple files for comparative analysis

Perfect for research, document review, data analysis, and content extraction.
:::{.wideshot}
[![llms-files.webp](/img/pages/ai-chat/llms-files.webp)](/img/pages/ai-chat/llms-files.webp)
:::

## Custom AI Chat Requests

Send Custom Chat Completion requests through the settings dialog, allowing Users to fine-tune their AI requests with advanced options including:

- **Temperature** `(0-2)` for controlling response randomness
- **Max Completion Tokens** to limit response length
- **Seed** values for deterministic sampling
- **Top P** `(0-1)` for nucleus sampling
- **Frequency** & **Presence Penalty** `(-2.0 to 2.0)` for reducing repetition
- **Stop** Sequences to control where the API stops generating
- **Reasoning Effort** constraints for reasoning models
- **Top Logprobs** `(0-20)` for token probability analysis
- **Verbosity** settings

:::{.wideshot}
[![llms-settings.webp](/img/pages/ai-chat/llms-settings.webp)](/img/pages/ai-chat/llms-settings.webp)
:::

## Enable / Disable Providers

**Admin** Users can manage which providers they want enabled or disabled at runtime. Providers that support the requested model are invoked in the order they're defined in `llms.json`. If a provider fails, the next available one is tried.

By default `llms.json` defines providers with Free tiers first, followed by local providers and then premium cloud providers, which can all be enabled or disabled from the UI:

:::{.wideshot}
[![llms-providers.webp](/img/pages/ai-chat/llms-providers.webp)](/img/pages/ai-chat/llms-providers.webp)
:::

## Search History

Quickly find past conversations with built-in search:

:::{.wideshot}
[![llms-search-python.webp](/img/pages/ai-chat/llms-search-python.webp)](/img/pages/ai-chat/llms-search-python.webp)
:::

## Smart Autocomplete for Models & System Prompts

Autocomplete components are used to quickly find and select the preferred model and system prompt. Only models from enabled providers will appear in the drop down, which will be available immediately after providers are enabled.

:::{.wideshot}
[![llms-autocomplete.webp](/img/pages/ai-chat/llms-autocomplete.webp)](/img/pages/ai-chat/llms-autocomplete.webp)
:::

## Comprehensive System Prompt Library

Access a curated collection of 200+ professional system prompts designed for various use cases, from technical assistance to creative writing.

:::{.wideshot}
[![llms-system-prompt.webp](/img/pages/ai-chat/llms-system-prompt.webp)](/img/pages/ai-chat/llms-system-prompt.webp)
:::

System Prompts can be added, removed & sorted in your `ui.json`:

```json
{
  "prompts": [
    {
      "id": "it-expert",
      "name": "Act as an IT Expert",
      "value": "I want you to act as an IT expert. You will be responsible..."
    },
    ...
  ]
}
```

### Reasoning

Access the thinking process of advanced AI models with specialized rendering for reasoning and chain-of-thought responses:

:::{.wideshot}
[![llms-reasoning.webp](/img/pages/ai-chat/llms-reasoning.webp)](/img/pages/ai-chat/llms-reasoning.webp)
:::

We're excited to get AI Chat in customers' hands. Please [let us know](https://servicestack.net/ideas) about any other missing features you'd love to see implemented.

## Creating a custom Explorer UI for OpenAI's Chat API

Anyone who's used ServiceStack's built-in [API Explorer](https://docs.servicestack.net/api-explorer) or [Auto HTML API](https://docs.servicestack.net/auto-html-api) UIs knows that not all API Explorer UIs are created equal.
The differences become more pronounced as APIs get larger and more complex, which we can see by comparing Swagger UI's rendering of [AI Chat's](https://servicestack.net/posts/ai-chat) `ChatCompletion` API:

[![](/img/pages/ai-chat/ai-chat-swagger-form.webp)](/img/pages/ai-chat/ai-chat-swagger-form.webp)

This is just the tip of the iceberg: the [full-length Swagger UI Screenshot](/img/pages/ai-chat/ai-chat-swagger-long.webp) is absurdly long, past the point of being usable.

As expected from a generic UI, we get very little assistance on what values are allowed, the numeric fields aren't number inputs, and the only dropdowns we see are for `bool` properties to select from their `true` and `false` values. There's also no chance of it being able to show App-specific options like which models are currently enabled.

## API Explorer UI

By contrast here is the same API rendered with ServiceStack's [API Explorer](https://docs.servicestack.net/api-explorer):

[![](/img/pages/ai-chat/ai-chat-form.webp)](/img/pages/ai-chat/ai-chat-form.webp)

This is much closer to what you'd expect from a hand-crafted Application UI and far more usable.

#### Properties use optimized UI Components

It renders an optimized UI for each property, with the **Model**, **Reasoning Effort**, **Service Tier** and **Verbosity** properties all using a [Combobox](https://docs.servicestack.net/vue/combobox) component for quickly searching through a list of supported options, or users can choose to enter a custom value.

**Bool** properties use Checkboxes whilst Numeric fields use **number** inputs, with integer properties only allowing integer values and floating point properties being able to step through fractional values.

#### UI-specific text hints

Each property also contains **placeholder** text and **help** text hints that are more focused and concise than the verbose API documentation.

#### HTML client-side validation

Client-side HTML validation ensures properties are valid and within any configured min/max values before any request is sent.

[![](/img/pages/ai-chat/ai-chat-form-completed.webp)](/img/pages/ai-chat/ai-chat-form-completed.webp)

### Custom Components for Complex Properties

The only property that doesn't use a built-in component is `Messages`, which is rendered with a custom `ChatMessages` component purpose-built to populate the `List<Message> Messages` property. It uses a **Markdown Editor** for the UserPrompt, a collapsible Textarea for any System Prompt and the ability to attach **image**, **audio** & **file** document attachments to the API request.

## How is it done?

The entire UI is driven by these [declarative annotations](https://docs.servicestack.net/locode/declarative) added on the [ChatCompletion](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.AI.Chat/ChatCompletion.cs) Request DTO:

```csharp
[Description("Chat Completions API (OpenAI-Compatible)")]
[Notes("The industry-standard, message-based interface for interfacing with Large Language Models.")]
public class ChatCompletion : IPost, IReturn<ChatResponse>
{
    [DataMember(Name = "messages")]
    [Input(Type = "ChatMessages", Label=""), FieldCss(Field = "col-span-12")]
    public List<Message> Messages { get; set; } = [];

    [DataMember(Name = "model")]
    [Input(Type = "combobox", EvalAllowableValues = "Chat.Models", Placeholder = "e.g. glm-4.6",
        Help = "ID of the model to use")]
    public string Model { get; set; }
glm-4.6", Help = "ID of the model to use")] public string Model { get; set; } [DataMember(Name = "reasoning_effort")] [Input(Type="combobox", EvalAllowableValues = "['low','medium','high','none','default']", Help = "Constrains effort on reasoning for reasoning models")] public string? ReasoningEffort { get; set; } [DataMember(Name = "service_tier")] [Input(Type = "combobox", EvalAllowableValues = "['auto','default']", Help = "Processing type for serving the request")] public string? ServiceTier { get; set; } [DataMember(Name = "safety_identifier")] [Input(Type = "text", Placeholder = "e.g. user-id", Help = "Stable identifier to help detect policy violations")] public string? SafetyIdentifier { get; set; } [DataMember(Name = "stop")] [Input(Type = "tag", Max = "4", Help = "Up to 4 sequences for the API to stop generating tokens")] public List? Stop { get; set; } [DataMember(Name = "modalities")] [Input(Type = "tag", Max = "3", Help = "The output types you would like the model to generate")] public List? Modalities { get; set; } [DataMember(Name = "prompt_cache_key")] [Input(Type = "text", Placeholder = "e.g. my-cache-key", Help = "Used by OpenAI to cache responses for similar requests")] public string? PromptCacheKey { get; set; } [DataMember(Name = "tools")] public List? Tools { get; set; } [DataMember(Name = "verbosity")] [Input(Type = "combobox", EvalAllowableValues = "['low','medium','high']", Placeholder = "e.g. low", Help = "Constrains verbosity of model's response")] public string? Verbosity { get; set; } [DataMember(Name = "temperature")] [Input(Type = "number", Step = "0.1", Min = "0", Max = "2", Placeholder = "e.g. 0.7", Help = "Higher values more random, lower for more focus")] public double? Temperature { get; set; } [DataMember(Name = "max_completion_tokens")] [Input(Type = "number", Value = "2048", Step = "1", Min = "1", Placeholder = "e.g. 2048", Help = "Max tokens for completion (inc. reasoning tokens)")] public int? MaxCompletionTokens { get; set; } [DataMember(Name = "top_logprobs")] [Input(Type = "number", Step = "1", Min = "0", Max = "20", Placeholder = "e.g. 5", Help = "Number of most likely tokens to return with log probs")] public int? TopLogprobs { get; set; } [DataMember(Name = "top_p")] [Input(Type = "number", Step = "0.1", Min = "0", Max = "1", Placeholder = "e.g. 0.5", Help = "Nucleus sampling - alternative to temperature")] public double? TopP { get; set; } [DataMember(Name = "frequency_penalty")] [Input(Type = "number", Step = "0.1", Min = "0", Max = "2", Placeholder = "e.g. 0.5", Help = "Penalize tokens based on frequency in text")] public double? FrequencyPenalty { get; set; } [DataMember(Name = "presence_penalty")] [Input(Type = "number", Step = "0.1", Min = "0", Max = "2", Placeholder = "e.g. 0.5", Help = "Penalize tokens based on presence in text")] public double? PresencePenalty { get; set; } [DataMember(Name = "seed")] [Input(Type = "number", Placeholder = "e.g. 42", Help = "For deterministic sampling")] public int? Seed { get; set; } [DataMember(Name = "n")] [Input(Type = "number", Placeholder = "e.g. 1", Help = "How many chat choices to generate for each input message")] public int? N { get; set; } [Input(Type = "checkbox", Help = "Whether or not to store the output of this chat request")] [DataMember(Name = "store")] public bool? Store { get; set; } [DataMember(Name = "logprobs")] [Input(Type = "checkbox", Help = "Whether to return log probabilities of the output tokens")] public bool? 
    [DataMember(Name = "parallel_tool_calls")]
    [Input(Type = "checkbox", Help = "Enable parallel function calling during tool use")]
    public bool? ParallelToolCalls { get; set; }

    [DataMember(Name = "enable_thinking")]
    [Input(Type = "checkbox", Help = "Enable thinking mode for some Qwen providers")]
    public bool? EnableThinking { get; set; }

    [DataMember(Name = "stream")]
    [Input(Type = "hidden")]
    public bool? Stream { get; set; }
}
```

Which uses the [[Input] attribute](https://docs.servicestack.net/locode/declarative#custom-fields-and-inputs) to control the HTML Input rendered for each property, whose `Type` can reference any HTML Input or any [ServiceStack Vue Component](https://docs.servicestack.net/vue/form-inputs) that's either built-in or registered with the Component library.

In addition, you also have control over the CSS of the containing **Field**, **Input** and **Label** elements with the [[FieldCss] attribute](https://docs.servicestack.net/locode/declarative#field), which here uses `[FieldCss(Field="col-span-12")]` to render the field spanning the full width of the form.

The `[Input(Type="hidden")]` is used to hide the `Stream` property from the UI since it's not applicable in an API Explorer UI.

### Combobox Values

The Combobox `EvalAllowableValues` can reference any JavaScript expression which is evaluated with [#Script](https://sharpscript.net), with the results embedded in the API Metadata that API Explorer uses to render its UI.

All combo boxes reference a static JS Array except for `Model`, which uses `EvalAllowableValues = "Chat.Models"` to invoke the registered `Chat` instance's `Models` property that returns an ordered list of all available models from all enabled providers:

```csharp
appHost.ScriptContext.Args[nameof(Chat)] = new Chat(this);

public class Chat(ChatFeature feature)
{
    public List<string> Models => feature.Providers.Values
        .SelectMany(x => x.Models.Keys)
        .Distinct()
        .OrderBy(x => x)
        .ToList();
}
```

### Custom ChatMessages Component

The only property that doesn't use a built-in component is:

```csharp
[Input(Type = "ChatMessages", Label=""), FieldCss(Field = "col-span-12")]
public List<Message> Messages { get; set; } = [];
```

Which makes use of a custom `ChatMessages` component in [/modules/ui/components/ChatMessages.mjs](https://github.com/ServiceStack/ServiceStack/blob/main/ServiceStack/src/ServiceStack.AI.Chat/modules/ui/components/ChatMessages.mjs).

Custom Components can be added to API Explorer in the same way as [overriding any built-in API Explorer](https://docs.servicestack.net/locode/custom-overview#ui) component, by adding it to your local `/wwwroot` folder:

```files
/modules
  /ui
    /components
      ChatMessages.mjs
```

All components added to the `/components` folder will be automatically registered and available for use.

That's all that's needed to customize the `ChatCompletion` Form UI in API Explorer. For more features and customizations see the [API Explorer Docs](https://docs.servicestack.net/api-explorer).

## XSS Vulnerability fixed in HtmlFormat.html

Late in this release cycle a Customer reported a DOM XSS vulnerability in ServiceStack's built-in [Auto HTML API](/auto-html-api) page, which has been fixed in [this commit](https://github.com/ServiceStack/ServiceStack/commit/76df4609410f7b440c3fb153371a1d29b9c06ac0) and is available from this ServiceStack v8.9+ release.
Alternatively it can also be prevented by rejecting requests with `"` in their path:

```csharp
GlobalRequestFilters.Add((req, res, dto) => {
    if (req.OriginalPathInfo.IndexOf('"') >= 0)
        throw HttpError.Forbidden("Illegal characters in path");
});
```

By reverting to use the old HTML Format:

```csharp
ServiceStack.Templates.HtmlTemplates.HtmlFormatName = "HtmlFormatLegacy.html";
```

Or by disabling the auto rendering of HTML API responses:

```csharp
SetConfig(new HostConfig { EnableAutoHtmlResponses = false });
```

To improve visibility of future security issues we've also created a [Security Vulnerabilities Watchlist](https://github.com/ServiceStack/Discuss/discussions/150). Please follow this thread to get notified of any updates.

# Release Notes History

Source: https://docs.servicestack.net/release-notes-history

## 2025

- [v8.9](/releases/v8_09)
- [v8.8](/releases/v8_08)
- [v8.7](/releases/v8_07)
- [v8.6](/releases/v8_06)

## 2024

- [v8.5](/releases/v8_05)
- [v8.4](/releases/v8_04)
- [v8.3](/releases/v8_03)
- [v8.2](/releases/v8_02)
- [v8.1](/releases/v8_01)

## 2023

- [v8](/releases/v8_00)
- [v6.11](/releases/v6_11)
- [v6.10](/releases/v6_10)
- [v6.9](/releases/v6_09)
- [v6.8](/releases/v6_08)
- [v6.7](/releases/v6_07)
- [v6.6](/releases/v6_06)

## 2022

- [v6.5](/releases/v6_05)
- [v6.4](/releases/v6_04)
- [v6.3](/releases/v6_03)
- [v6.2](/releases/v6_02)
- [v6.1](/releases/v6_01)
- [v6](/releases/v6_00)

## 2021

- [v5.13](/releases/v5_13)
- [v5.12](/releases/v5_12)
- [v5.11](/releases/v5_11)

## 2020

- [v5.10](/releases/v5_10)
- [v5.9](/releases/v5_9)
- [v5.8](/releases/v5_8)

## 2019

- [v5.7](/releases/v5_7)
- [v5.6](/releases/v5_6)
- [v5.5](/releases/v5_5)

## 2018

- [v5.4](/releases/v5_4)
- [v5.2](/releases/v5_2)
- [v5.1.0](/releases/v5_1_0)
- [v5.0.2](/releases/v5_0_0)

## 2017

- [v5.0.0](/releases/v5_0_0#v5-release-notes)
- [v4.5.14](/releases/v4_5_14)
- [v4.5.10](/releases/v4_5_10)
- [v4.5.8](/releases/v4_5_8)
- [v4.5.6](/releases/v4_5_6)

## 2016

- [v4.5.2](/releases/v4_5_2)
- [v4.5.0](/releases/v4_5_0)
- [v4.0.62](/releases/v4_0_62)
- [v4.0.60](/releases/v4_0_60)
- [v4.0.56](/releases/v4_0_56)
- [v4.0.54](/releases/v4_0_54)
- [v4.0.52](/releases/v4_0_52)

## 2015

- [v4.0.50](/releases/v4_0_50)
- [v4.0.48](/releases/v4_0_48)
- [v4.0.46](/releases/v4_0_46)
- [v4.0.44](/releases/v4_0_44)
- [v4.0.42](/releases/v4_0_42)
- [v4.0.40](/releases/v4_0_40)
- [v4.0.38](/releases/v4_0_38)
- [v4.0.36](/releases/v4_0_36)

## 2014

- [v4.0.35](/releases/v4_0_35)
- [v4.0.34](/releases/v4_0_34)
- [v4.0.33](/releases/v4_0_33)
- [v4.0.32](/releases/v4_0_32)
- [v4.0.31](/releases/v4_0_31)
- [v4.0.30](/releases/v4_0_30)
- [v4.0.24](/releases/v4_0_24)
- [v4.0.23](/releases/v4_0_23)
- [v4.0.22](/releases/v4_0_22)
- [v4.0.21](/releases/v4_0_21)
- [v4.0.19](/releases/v4_0_19)
- [v4.0.18](/releases/v4_0_18)
- [v4.0.15](/releases/v4_0_15)
- [v4.0.12](/releases/v4_0_12)
- [v4.0.11](/releases/v4_0_11)
- [v4.0.10](/releases/v4_0_10)
- [v4.0.09](/releases/v4_0_09)
- [v4.0.08](/releases/v4_0_08)
- [v4.0.06](/releases/v4_0_06)
- [v4.0.0](/releases/v4_0_0)

## 2013 and prior

- [Older v3 Release Notes](/release-notes-v3)

# Pre Release NuGet Packages

Source: https://docs.servicestack.net/pre-release

## ServiceStack Pre-Release NuGet Packages

Our interim pre-release NuGet packages in between major releases on NuGet are published to [Feedz.io](https://feedz.io/).
::: tip
If preferred, the pre-release packages are also available in our [MyGet](/myget) or [GitHub Packages Registry](/gh-nuget)
:::

### Add using Mix

If you have the [dotnet x tool](/dotnet-tool) installed, you can configure your projects by downloading a `NuGet.Config` in the same folder as your **.sln**:

:::sh
x mix feedz
:::

### Add using VS .NET

Instructions to add ServiceStack's Pre-Release packages feed to VS .NET are:

1. Go to **Tools** > **Options** > **Nuget Package Manager** > **Package Sources**
2. Add the Source `https://f.feedz.io/servicestack/pre-release/nuget/index.json` with the name of your choice, e.g. `ServiceStack Pre-Release`

After registering the feed it will show up under NuGet package sources when opening the NuGet package manager dialog:

![NuGet Package Manager](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/wikis/myget/package-manager-ui.png)

Which will allow you to search and install pre-release packages from the selected Pre Release packages feed.

### Adding Pre-Release NuGet feed without VS .NET

If you're not using or don't have VS .NET installed, you can add the Pre-Release feed to your NuGet.config at `%AppData%\NuGet\NuGet.config`, e.g. with an entry like:

```xml
<!-- Illustrative example: register the Feedz.io pre-release package source -->
<configuration>
  <packageSources>
    <add key="ServiceStack Pre-Release"
         value="https://f.feedz.io/servicestack/pre-release/nuget/index.json" />
  </packageSources>
</configuration>
```

## Redownloading Pre Release packages

If you've previously installed packages with the **same version number** from Feedz, you will need to manually delete the NuGet `/packages` folder for NuGet to pull down the latest packages.

### Clear NuGet Package Cache

You can clear your local NuGet packages cache in any OS by running the command-line below in your favorite Terminal:

:::sh
nuget locals all -clear
:::

If `nuget` is not in your System's `PATH`, it can also be invoked from the `dotnet` tool:

:::sh
dotnet nuget locals all --clear
:::

Within VS .NET you can clear them from **Tools** > **Options** > **Nuget Package Manager** and click **Clear All NuGet Cache(s)**:

![Clear Packages Cache](https://raw.githubusercontent.com/ServiceStack/Assets/master/img/wikis/myget/clear-package-cache.png)

Alternatively on Windows you can delete the Cached NuGet packages manually with:

:::sh
del %LOCALAPPDATA%\NuGet\Cache\*.nupkg /q
:::

### Full Package Clean

In most cases clearing the NuGet packages cache will suffice, but sometimes you'll also need to manually delete the other local package caches.

Delete all NuGet packages in the `/packages` folder:

:::sh
rd /q /s packages
:::

Delete the `/bin` and `/obj` folders in the host project:

:::sh
rd /q /s bin obj
:::

## Versioning Scheme

::include versioning-scheme.md::

# Create your first WebService

Source: https://docs.servicestack.net/create-your-first-webservice

This is a quick walkthrough of getting your first web service up and running whilst having a look at how some of the different components work.

## Step 1: Install the x dotnet tool

First we want to install the [x dotnet tool](/dotnet-tool):

:::sh
dotnet tool install --global x
:::

The [dotnet tools](/dotnet-tool) are ServiceStack's versatile companion, giving you quick access to a lot of its high-level features including generating mobile, web & desktop DTOs with [Add ServiceStack Reference](/add-servicestack-reference), generating [gRPC Clients and proto messages](/grpc/), quickly [applying gists](/mix-tool) to your project enabled by ServiceStack's effortless [no-touch Modular features](/modular-startup) and [command-line API access](/post-command). It even includes a [lisp REPL](https://sharpscript.net/lisp/) should you need to explore your [remote .NET Apps in real-time](https://sharpscript.net/lisp/#techstacks-tcp-lisp-repl-demo).
## Step 2: Selecting a template

Importantly, the dotnet tools let you create [.NET 6, .NET Framework](/dotnet-new) and [ASP.NET Core on .NET Framework](/templates/corefx) projects. Unless you're restricted to working with .NET Framework you'll want to start with a [.NET 6 project template](/templates/dotnet-new#usage). For this example we'll start with the Empty [web](https://github.com/NetCoreTemplates/web) template, which implicitly uses the folder name for the Project Name:

:::sh
x new web WebApp
:::

## Step 3: Run your project

Press `Ctrl+F5` to run your project! You should see an already working API integration using the [@servicestack/client](/javascript-client) library to call your App's [JavaScript DTOs](/javascript-add-servicestack-reference) and links to calling your API from [API Explorer](/api-explorer):

#### Watched builds

A recommended alternative to running your project from your IDE is to run a watched build using `dotnet watch` from a terminal:

:::sh
dotnet watch
:::

Where it will automatically rebuild & restart your App when it detects any changes to your App's source files.

### How does it work?

Now that your new project is running, let's have a look at what we have. The template comes with a single web service route which comes from the Request DTO (Data Transfer Object) located in the [Hello.cs](https://github.com/NetCoreTemplates/web/blob/master/MyApp.ServiceModel/Hello.cs) file:

```csharp
[Route("/hello/{Name}")]
public class Hello : IReturn<HelloResponse>
{
    public string Name { get; set; }
}

public class HelloResponse
{
    public string Result { get; set; }
}
```

The `Route` attribute specifies the `/hello/{Name}` path, where `{Name}` binds its value to the public **Name** string property.

Let's access the route to see what comes back. Go to the following URL in your address bar: `/hello/world`

You will see a snapshot of the Result in an HTML response format. To change the return format to JSON, simply add `?format=json` to the end of the URL. You'll learn more about [formats](/formats), endpoints (URLs, etc) when you continue reading the documentation.

If we go back to the solution, find the **WebApp.ServiceInterface** project and open the **MyServices.cs** file, we can have a look at the code that is responding to the browser, giving us the **Result** back:

```csharp
public class MyServices : Service
{
    public object Any(Hello request)
    {
        return new HelloResponse { Result = $"Hello, {request.Name}!" };
    }
}
```

If we look at the code above, there are a few things to note. The name of the method `Any` means the server will run this method for any of the valid HTTP Verbs. Service methods are where you control what is returned from your service.

## Step 4: Exploring the ServiceStack Solution

The recommended structure below is built into all ServiceStackVS VS.NET Templates, where creating any new ServiceStack project creates a solution with a minimum of the 4 projects below, ensuring ServiceStack solutions start off from an optimal logical project layout and laying the foundation for growing into a more maintainable, cohesive and reusable code-base:

### Host Project

The Host project contains your AppHost, which references and registers all your App's concrete dependencies in its IOC and is the central location where all App configuration and global behavior is maintained. It also references all Web Assets like Razor Views, JS, CSS, Images, Fonts, etc. that need to be deployed with the App.
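For reference, here's a minimal sketch of what a Host project's `AppHost` typically looks like - the class and project names are illustrative and will vary by template:

```csharp
// Minimal sketch of a Host project's AppHost (names are illustrative)
public class AppHost() : AppHostBase("WebApp"), IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            // Register your App's concrete dependencies in the IOC here
        });

    // Central location for App configuration and global behavior
    public override void Configure()
    {
        SetConfig(new HostConfig {
        });
    }
}
```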
The AppHost is the top-level project which references all dependencies used by your App. Its role is akin to an orchestrator and conduit, deciding what functionality is made available and which concrete implementations are used. By design it references all other (non-test) projects whilst nothing references it, and as a goal it should be kept free of any App or Business logic.

### ServiceInterface Project

The ServiceInterface project is the implementation project where all Business Logic and Services live, which typically references every other project except the Host projects. Small and Medium projects can maintain all their implementation here, where logic can be grouped under feature folders. Large solutions can split this project into more manageable, cohesive and modular projects, which we also recommend encapsulate any dependencies they might use.

### ServiceModel Project

The ServiceModel project contains all your Application's DTOs, which is what defines your Services' contract. Keeping them isolated from any Server implementation is how your Service is able to encapsulate its capabilities and make them available behind a remote facade.

There should only be one ServiceModel project per solution, which contains all your DTOs and should be implementation, dependency and logic-free, only referencing the impl/dep-free **ServiceStack.Interfaces.dll** contract assembly. This ensures Service contracts are decoupled from their implementation and enforces interoperability, ensuring your Services don't mandate specific client implementations. It also ensures this is the only project clients need in order to call any of your Services, by either referencing the **ServiceModel.dll** directly or downloading the DTOs from a remote ServiceStack instance using [Add ServiceStack Reference](/add-servicestack-reference):

![](/img/pages/dtos-role.png)

### Test Project

The Unit Test project contains all your Unit and Integration tests. It's also a Host project that typically references all other non-Host projects in the solution and contains a combination of concrete and mock dependencies depending on what's being tested. See the [Testing Docs](/testing) for more information on testing ServiceStack projects.

## Learn ServiceStack Guide

If you're new to ServiceStack we recommend stepping through [ServiceStack's Getting Started Guide](https://servicestack.net/start/project-overview) to get familiar with the basics.

## API Client Examples

### jQuery Ajax

ServiceStack's clean Web Services make it simple and intuitive to call ServiceStack Services from any Ajax client, e.g. from a traditional [Bootstrap Website using jQuery](https://github.com/ServiceStack/Templates/blob/master/src/ServiceStackVS/BootstrapWebApp/BootstrapWebApp/default.cshtml):

```html
<!-- Illustrative sketch: calling the /hello API with jQuery (element ids are assumed) -->
<input class="form-control" id="Name" type="text" placeholder="Type your name">
<p id="result"></p>
<script>
$('#Name').keyup(function () {
    var name = $('#Name').val();
    if (name) {
        $.getJSON('/hello/' + name)
            .done(function (response) {
                $('#result').html(response.Result);
            });
    } else {
        $('#result').html('');
    }
});
</script>
```

### Rich JsonApiClient & Typed DTOs

The modern recommended alternative to jQuery that works in all modern browsers is using your API's built-in [JavaScript typed DTOs](/javascript-add-servicestack-reference) with the [@servicestack/client](/javascript-client) library from a [JavaScript Module](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules).

We recommend using an [importmap](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/script/type/importmap) to specify where **@servicestack/client** should be loaded from, e.g:

```html
<!-- Illustrative sketch: the module path is an assumption, use wherever your App serves it from -->
<script type="importmap">
{
    "imports": {
        "@servicestack/client": "/js/servicestack-client.mjs"
    }
}
</script>
```

This lets us reference the **@servicestack/client** package name in our source code instead of its physical location:

```html
<!-- Illustrative sketch: calling the Hello API with typed DTOs (DTO module path is an assumption) -->
<script type="module">
import { JsonServiceClient } from '@servicestack/client'
import { Hello } from '/types/mjs'

const client = new JsonServiceClient()
const api = await client.api(new Hello({ name: 'World' }))
if (api.succeeded) {
    console.log(api.response)
}
</script>
``` ```html ``` ### Enable static analysis and intelli-sense For better IDE intelli-sense during development, save the annotated Typed DTOs to disk with the [x dotnet tool](/dotnet-tool): :::sh x mjs ::: Then reference it instead to enable IDE static analysis when calling Typed APIs from JavaScript: ```js import { Hello } from '/js/dtos.mjs' client.api(new Hello({ name })) ``` To also enable static analysis for **@servicestack/client**, install the dependency-free library as a dev dependency: :::sh npm install -D @servicestack/client ::: Where only its TypeScript definitions are used by the IDE during development to enable its type-checking and intelli-sense. ### Rich intelli-sense support Where you'll be able to benefit from rich intelli-sense support in smart IDEs like [Rider](https://www.jetbrains.com/rider/) for both the client library: ![](/img/pages/mix/init-rider-ts-client.png) As well as your App's server generated DTOs: ![](/img/pages/release-notes/v6.6/mjs-intellisense.png) So even simple Apps without complex bundling solutions or external dependencies can still benefit from a rich typed authoring experience without any additional build time or tooling complexity. ## Create Empty ServiceStack Apps ::include empty-projects.md:: ### Any TypeScript or JavaScript Web, Node.js or React Native App The same TypeScript [JsonServiceClient](/javascript-client) can also be used in more sophisticated JavaScript Apps like [React Native](/typescript-add-servicestack-reference#react-native-jsonserviceclient) to [Node.js Server Apps](https://github.com/ServiceStackApps/typescript-server-events) such as this example using TypeScript & [Vue Single-File Components](https://vuejs.org/guide/scaling-up/sfc.html): ```html ``` Compare and contrast with other major SPA JavaScript Frameworks: - [Vue 3 HelloApi.mjs](https://github.com/NetCoreTemplates/blazor-vue/blob/main/MyApp/wwwroot/posts/components/HelloApi.mjs) - [Vue SSG using swrClient](https://github.com/NetCoreTemplates/vue-ssg/blob/main/ui/src/components/HelloApi.vue) - [Next.js with swrClient](https://github.com/NetCoreTemplates/nextjs/blob/main/ui/components/intro.tsx) - [React HelloApi.tsx](https://github.com/NetCoreTemplates/react-spa/blob/master/MyApp/src/components/Home/HelloApi.tsx) - [Angular HelloApi.ts](https://github.com/NetCoreTemplates/angular-spa/blob/master/MyApp/src/app/home/HelloApi.ts) - [Svelte Home.svelte](https://github.com/NetCoreTemplates/svelte-spa/blob/master/MyApp/src/components/Home.svelte) ### Web, Mobile and Desktop Apps Use [Add ServiceStack Reference](/add-servicestack-reference) to enable typed integrations for the most popular languages to develop Web, Mobile & Desktop Apps. 
### Full .NET Project Templates The above `init` projects allow you to create a minimal web app, to create a more complete ServiceStack App with the recommended project structure, start with one of our C# project templates instead: ### [C# Project Templates Overview](/templates/) ## Simple, Modern Razor Pages & MVC Vue 3 Tailwind Templates The new Tailwind Razor Pages & MVC Templates enable rapid development of Modern Tailwind Apps without the [pitfalls plaguing SPAs](https://servicestack.net/posts/javascript): All Vue Tailwind templates are pre-configured with our rich [Vue 3 Tailwind Components](/vue/) library for maximum productivity: ## Advanced JAMStack Templates For more sophisticated Apps that need the best web tooling that npm can offer checkout our JAMStack Vite Vue & SSG templates: Or if you prefer Modern React Apps checkout the Next.js template: For Blazor WASM and Server checkout our comprehensive [Blazor projects & Tailwind components](/templates/blazor-tailwind). ### Integrated in Major IDEs and popular Mobile & Desktop platforms ServiceStack Services are also [easily consumable from all major Mobile and Desktop platforms](/why-servicestack#generate-instant-typed-apis-from-within-all-major-ides) including native iPhone and iPad Apps on iOS with Swift, Mobile and Tablet Apps on Android with Java or Kotlin, OSX Desktop Applications as well as targeting the most popular .NET Mobile and Desktop platforms including Xamarin.iOS, Xamarin.Android, Windows Store, WPF and WinForms. ## Instant Client Apps Generate working native client apps for your live ServiceStack services, in a variety of languages, instantly with our free managed service. This tool enables your developers, and even your customers, to open a working example native application straight from the web to their favorite IDE. 
## Fundamentals - AppHost and Configuration Walk through configuring your ServiceStack Application's `AppHost`: ## Community Resources - [Creating A Simple Service Using ServiceStack](https://www.c-sharpcorner.com/UploadFile/shashijeevan/creating-a-simple-service-using-servicestack779/) by [Shashi Jeevan](http://shashijeevan.net/author/shashijeevan/) - [Introducing ServiceStack](https://www.dotnetcurry.com/aspnet/1056/introducing-service-stack-tutorial) by [@dotnetcurry](https://twitter.com/DotNetCurry) - [Create web services in .NET in a snap with ServiceStack](https://www.techrepublic.com/article/create-web-services-in-net-in-a-snap-with-servicestack/) by [@techrepublic](https://twitter.com/techrepublic) - [How to build web services in MS.Net using ServiceStack](https://kborra.wordpress.com/2014/07/29/how-to-build-web-services-in-ms-net-using-service-stack/) by [@kishoreborra](http://kborra.wordpress.com/about/) - [Getting started with ServiceStack – Creating a service](https://dilanperera.wordpress.com/2014/02/22/getting-started-with-servicestack-creating-a-service/) - [ServiceStack Quick Start](https://debuggers.domains/post/servicestack-quick-start/) by [@aarondandy](https://github.com/aarondandy) - [Getting Started with ASP.NET MVC, ServiceStack and Bootstrap](https://www.pluralsight.com/courses/getting-started-aspdotnet-mvcservice-stack-bootstrap) by [@pluralsight](http://twitter.com/pluralsight) - [Building Web Applications with Open-Source Software on Windows](https://www.pluralsight.com/courses/building-web-application-open-source-software-on-windows) by [@pluralsight](http://twitter.com/pluralsight) - [ServiceStack the way I like it](https://www.antonydenyer.co.uk/2012-09-20-servicestack-the-way-i-like-it/) by [@tonydenyer](https://twitter.com/tonydenyer) - [Generating a RESTful Api and UI from a database with LLBLGen](https://www.mattjcowan.com/funcoding/2013/03/10/rest-api-with-llblgen-and-servicestack/) by [@mattjcowan](https://twitter.com/mattjcowan) - [ServiceStack: Reusing DTOs](https://korneliuk.blogspot.com/2012/08/servicestack-reusing-dtos.html) by [@korneliuk](https://twitter.com/korneliuk) - [ServiceStack, Rest Service and EasyHttp](https://blogs.lessthandot.com/index.php/WebDev/ServerProgramming/servicestack-restservice-and-easyhttp) by [@chrissie1](https://twitter.com/chrissie1) - [Building a Web API in SharePoint 2010 with ServiceStack](https://www.mattjcowan.com/funcoding/2012/05/04/building-a-web-api-in-sharepoint-2010-with-servicestack/) - [REST Raiding. ServiceStack](https://dgondotnet.blogspot.com/2012/04/rest-raiding-servicestack.html) by [Daniel Gonzalez](http://www.blogger.com/profile/13468563783321963413) - [JQueryMobile and ServiceStack: EventsManager tutorial](https://kylehodgson.com/2012/04/21/jquerymobile-and-service-stack-eventsmanager-tutorial-post-2/) / [Part 3](https://kylehodgson.com/2012/04/23/jquerymobile-and-service-stack-eventsmanager-tutorial-post-3/) by Kyle Hodgson - [Like WCF: Only cleaner!](https://kylehodgson.com/2012/04/18/like-wcf-only-cleaner-9/) by Kyle Hodgson - [ServiceStack I heart you. 
My conversion from WCF to SS](https://www.philliphaydon.com/2012/02/21/service-stack-i-heart-you-my-conversion-from-wcf-to-ss/) by [@philliphaydon](https://twitter.com/philliphaydon)
- [ServiceStack vs WCF Data Services](https://codealoc.wordpress.com/2012/03/24/service-stack-vs-wcf-data-services/)
- [Building a Tridion WebService with jQuery and ServiceStack](https://www.curlette.com/?p=161) by [@robrtc](https://twitter.com/#!/robrtc)
- [Anonymous type + Dynamic + ServiceStack == Consuming cloud has never been easier](https://www.ienablemuch.com/2012/05/anonymous-type-dynamic-servicestack.html) by [@ienablemuch](https://twitter.com/ienablemuch)
- [Handful of examples of using ServiceStack based on the ServiceStack.Hello Tutorial](https://github.com/jfoshee/TryServiceStack) by [@82unpluggd](https://twitter.com/82unpluggd)

# Your first Web Service Explained

Source: https://docs.servicestack.net/your-first-webservice-explained

Let's look a bit deeper into the [Hello World service](/create-your-first-webservice#how-does-it-work) you created:

As you have seen, the convention for a Request/Response DTO pair is `RequestDTO` and `RequestDTOResponse`. **Note: the request and response DTO should be in the same namespace if you want ServiceStack to recognize the DTO pair**.

To support automatic exception handling, you also need to add a `ResponseStatus` property to the response DTO:

```csharp
// Request DTO
public class Hello : IReturn<HelloResponse>
{
    public string Name { get; set; }
}

// Response DTO (follows naming convention)
public class HelloResponse
{
    public string Result { get; set; }
    public ResponseStatus ResponseStatus { get; set; } //Automatic exception handling
}
```

Services are implemented in a class that either inherits from the `Service` base class or implements the `IService` empty marker interface. Inheriting from the convenient `Service` base class provides easy access to the most common functionality.

```csharp
public class HelloService : Service
{
    public object Any(Hello request)
    {
        return new HelloResponse { Result = $"Hello, {request.Name}" };
    }
}
```

The above service can be called with **Any** HTTP Verb (e.g. GET, POST, ..) from any endpoint or format (e.g. JSON, XML, etc). You can also choose to handle a specific Verb by changing the method name to suit. E.g. you can limit the Service to only handle HTTP **GET** requests by using the `Get` method:

```csharp
public class HelloService : Service
{
    public object Get(Hello request) =>
        new HelloResponse { Result = $"Hello, {request.Name}" };
}
```

## Calling Web Services

Thanks to the above `IReturn<HelloResponse>` interface marker you'll be able to use the terse, typed Service Client APIs, e.g:

```csharp
var client = new JsonApiClient(BaseUri);

HelloResponse response = client.Get(new Hello { Name = "World" });
```

Request DTOs that don't implement `IReturn<T>` will need to explicitly specify the Response DTO at their call-site, e.g:

```csharp
HelloResponse response = client.Get<HelloResponse>(new Hello { Name = "World" });
HelloResponse response = client.Get<HelloResponse>("/hello/World!");
```

Alternatively you could use a general purpose HTTP Client like [HTTP Utils](/http-utils):

```csharp
HelloResponse response = "http://base.url/hello/World"
    .GetJsonFromUrl()
    .FromJson<HelloResponse>();
```

We highly recommend annotating Request DTO's with the above `IReturn<T>` marker as it enables a generic typed API without clients having to know and specify the Response at each call-site, which would be invalidated and need to be manually updated if the Service Response Type changes.
More details on the Service Clients are available on the [C#/.NET Service Clients page](/csharp-client).

### Registering Custom Routes

If no routes are defined the .NET Service Clients will use the [pre-defined Routes](/routing#pre-defined-routes). You can annotate your Request DTO with the `[Route]` attribute to register additional Custom Routes, e.g:

```csharp
//Request DTO
[Route("/hello")]
[Route("/hello/{Name}")]
public class Hello : IReturn<HelloResponse>
{
    public string Name { get; set; }
}
```

The .NET Service Clients will then use the best matching Route based on the populated properties on the Request DTO.

### Routing Tips

No **?queryString** or POST Form Data should be included in the route as ServiceStack automatically populates Request DTOs with all matching params, e.g:

```csharp
[Route("/hello")]
```

Matches both `/hello` and `/hello?name=World` with the latter populating the `Name` Request DTO **public property**. When the route includes a variable, e.g:

```csharp
[Route("/hello/{Name}")]
```

It only matches:

```
/hello/name
```

Whereas using a wildcard:

```csharp
[Route("/hello/{Name*}")]
```

Matches all routes:

```
/hello
/hello/name
/hello/my/name/is/ServiceStack
```

More details about Routing are available on the [Routing page](/routing).

# ServiceStack API design

Source: https://docs.servicestack.net/api-design

The primary difference between developing RPC vs ServiceStack's [Message-based Services](/what-is-a-message-based-web-service) is that the Service's entire contract is defined by its typed messages, specifically the Request DTO which defines both the System inputs and identifies the System output. Typically both are POCO DTOs however the [response can be any serializable object](/service-return-types). The simplest Service example that does this is:

```csharp
public class MyRequest : IReturn<MyRequest> {}

public class MyServices : Service
{
    public object Any(MyRequest request) => request;
}
```

As only the `Any()` wildcard method is defined, it will get executed whenever the `MyRequest` Service is invoked via **any HTTP Verb**, [gRPC](/grpc/), [MQ](/messaging) or [SOAP](/soap-support) Request. The Request DTO is also all that's required to invoke it via any [Typed Generic Service Client](/clients-overview) in any supported language, e.g:

```csharp
MyRequest response = client.Get(new MyRequest());
```

All Services are accessible by their [pre-defined routes](/routing#pre-defined-routes); we can turn it into a functional data-driven Service by annotating it with a [user-defined route](/routing) and changing the implementation to return all App Contacts:

```csharp
public class Contact
{
    public int Id { get; set; }
    public string Name { get; set; }
}

[Route("/contacts")]
public class GetContacts : IReturn<List<Contact>> { }

public class ContactsService : Service
{
    public object Get(GetContacts request) => Db.Select<Contact>();
}
```

Which your C# clients will still be able to call with:

```csharp
List<Contact> response = client.Get(new GetContacts());
```

This will make a **GET** call to the custom `/contacts` URL and return all rows from the `Contact` Table in the configured RDBMS using [OrmLite](https://github.com/ServiceStack/ServiceStack.OrmLite)'s `Select()` extension method on the `base.Db` ADO.NET `IDbConnection` property of ServiceStack's convenience `Service` base class. Using `Get()` limits access to this service to HTTP **GET** requests only; all other HTTP Verb requests to `/contacts` will return a **404 NotFound** HTTP Error Response.
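To extend the example (a rough sketch: the `CreateContact` DTO and its properties are assumptions, not part of the original Service), additional HTTP Verbs are handled by simply adding more typed Action methods to the same Service class:

```csharp
[Route("/contacts", "POST")]
public class CreateContact : IReturn<Contact>
{
    public string Name { get; set; }
}

public class ContactsService : Service
{
    // Existing GET handler from the example above
    public object Get(GetContacts request) => Db.Select<Contact>();

    // POST /contacts inserts a new Contact and returns it
    public async Task<object> Post(CreateContact request)
    {
        var contact = new Contact { Name = request.Name };
        contact.Id = (int) await Db.InsertAsync(contact, selectIdentity: true);
        return contact;
    }
}
```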
### Using explicit Response DTO

Our recommendation instead of returning naked collections is to return an explicit, predictable Response DTO, e.g:

```csharp
[Route("/contacts")]
public class GetContacts : IReturn<GetContactsResponse> { }

public class GetContactsResponse
{
    public List<Contact> Results { get; set; }
    public ResponseStatus ResponseStatus { get; set; }
}

public class ContactsService : Service
{
    public object Get(GetContacts request) =>
        new GetContactsResponse { Results = Db.Select<Contact>() };
}
```

Whilst slightly more verbose, this style benefits from [more resilience in evolving and versioning](https://stackoverflow.com/a/12413091/85785) message-based Services and more coarse-grained APIs as additional results can be added to the Response DTO without breaking existing clients. You'll also need to follow the above convention if you want to [support SOAP endpoints](/soap-support) or be able to handle Typed [Response Messages in MQ Services](/messaging#message-workflow).

### All APIs have a preferred default method

Like the `Send*` APIs before them, both [API Explorer](/api-explorer) and the new [`Api*` methods](/csharp-client.html#high-level-api-and-apiasync-methods) send API requests using an API's **preferred HTTP Method** which can be defined either:

- Explicitly annotating Request DTOs with `IGet`, `IPost`, etc. **IVerb** interface markers
- Using the verb specified in its user-defined `[Route]` attribute (if a single verb is specified)
- Implicitly when using AutoQuery/CRUD Request DTOs
- Using the Service's **Verb()** implementation method if not using **Any()**

If the HTTP Method can't be inferred, it defaults to using HTTP **POST**. But as good API documentation practice, we recommend specifying the HTTP Method each API should use, preferably using the `IVerb` interface marker, so it's embedded into the API's Service Contract shared with clients (not required for AutoQuery APIs).

## ServiceStack's API Design

We'll walk through a few examples here but for a more detailed look into the usages and capabilities of ServiceStack's API design check out its [Comprehensive Test Suite](https://github.com/ServiceStack/ServiceStack/blob/master/tests/RazorRockstars.Console.Files/ReqStarsService.cs)

At a minimum ServiceStack Services only need to implement the `IService` empty interface:

```csharp
public interface IService {}
```

The interface is used as a Marker interface that ServiceStack uses to find, register and auto-wire your existing services.
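For instance, a minimal hedged sketch (the `HealthCheck` DTOs are hypothetical) of a Service that only implements the empty `IService` marker rather than inheriting a base class:

```csharp
public class HealthCheck : IReturn<HealthCheckResponse> {}

public class HealthCheckResponse
{
    public string Status { get; set; }
}

// No base class: the IService marker alone is enough for ServiceStack to
// discover, register and auto-wire this Service
public class HealthCheckService : IService
{
    public object Any(HealthCheck request) => new HealthCheckResponse { Status = "OK" };
}
```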
Although you're more likely going to want to inherit from ServiceStack's convenience concrete `Service` class which contains easy access to ServiceStack's providers:

```csharp
public class Service : IService
{
    IRequest Request { get; }                          // HTTP Request Context
    IResponse Response { get; }                        // HTTP Response Context
    IServiceGateway Gateway { get; }                   // Built-in Service Gateway
    IMessageProducer MessageProducer { get; }          // Message Producer for Registered MQ Server
    void PublishMessage<T>(T message);                 // Publish messages to Registered MQ Server
    IVirtualPathProvider VirtualFileSources { get; }   // Virtual FileSystem Sources
    IVirtualFiles VirtualFiles { get; }                // Writable Virtual FileSystem
    ICacheClient Cache { get; }                        // Registered Caching Provider
    ICacheClientAsync CacheAsync { get; }              // Registered Async Caching Provider (or sync wrapper)
    MemoryCacheClient LocalCache { get; }              // Local InMemory Caching Provider
    IDbConnection Db { get; }                          // Registered ADO.NET IDbConnection
    IRedisClient Redis { get; }                        // Registered RedisClient
    ValueTask<IRedisClientAsync> GetRedisAsync();      // Registered Async RedisClient
    IAuthRepository AuthRepository { get; }            // Registered User Repository
    IAuthRepositoryAsync AuthRepositoryAsync { get; }  // Registered Async User Repository
    ISession SessionBag { get; }                       // Dynamic Session Bag
    ISessionAsync SessionBagAsync { get; }             // Dynamic Async Session Bag
    Task<TUserSession> SessionAsAsync<TUserSession>(); // Resolve Typed UserSession Async
    TUserSession SessionAs<TUserSession>();            // Resolve Typed UserSession
    IAuthSession GetSession();                         // Resolve base IAuthSession
    Task<IAuthSession> GetSessionAsync();              // Resolve base IAuthSession Async
    bool IsAuthenticated { get; }                      // Is Authenticated Request
    T TryResolve<T>();                                 // Resolve dependency at runtime
    T ResolveService<T>();                             // Resolve an auto-wired service
    T GetPlugin<T>();                                  // Resolve optional registered Plugin
    T AssertPlugin<T>();                               // Resolve required registered Plugin
    void Dispose();                                    // Override to implement custom IDispose
    ValueTask DisposeAsync();                          // Implement IAsyncDisposable (.NET v4.7.2+)
}
```

### Basic example - Handling Any HTTP Verb

Let's revisit the Simple example from earlier:

```csharp
[Route("/contacts")]
public class GetContacts : IReturn<List<Contact>> { }

public class ContactsService : Service
{
    public object Get(GetContacts request) => Db.Select<Contact>();
}
```

ServiceStack maps HTTP Requests to your Services **Actions**. An Action is any method that:

- Is `public`
- Only contains a **single argument - the typed Request DTO**
- Has a Method name matching a **HTTP Method** or **Any** (the fallback that can handle "ANY" method)
- Methods can have a **Format** suffix to handle specific formats, e.g. if it exists `GetJson` will handle **GET JSON** requests
- Can specify either `T` or `object` Return type, both have the same behavior

### Content-Type Specific Service Implementations

Service methods can also use `Verb{Format}` method names to provide a different implementation for handling a specific Content-Type. The Service below defines several different implementations for handling the same Request:

```csharp
[Route("/my-request")]
public class MyRequest
{
    public string Name { get; set; }
}

public class ContentTypeServices : Service
{
    // Handles GET /my-request for JSON responses
    public object GetJson(MyRequest request) => ..;

    // Handles GET /my-request for HTML Responses
    public object GetHtml(MyRequest request) =>
        $@"

<html><body><h1>GetHtml {request.Name}</h1></body></html>

"; public object AnyHtml(MyRequest request) => // Handles other POST/PUT/etc Verbs for HTML Responses $@"

<html><body><h1>AnyHtml {request.Name}</h1></body></html>

"; public object Any(MyRequest request) => ...; // Handles all other unspecified Verbs/Formats } ``` ### Optional *Async Suffixes In addition your Services can optionally have the `*Async` suffix which by .NET Standard (and ServiceStack) guidelines is preferred for Async methods to telegraph to client call sites that its response should be awaited. ```csharp [Route("/contacts")] public class GetContacts : IReturn> { } public class ContactsService : Service { public async Task GetAsync(GetContacts request) => await Db.SelectAsync(); public object GetHtmlAsync(MyRequest request) => $@"

<html><body><h1>GetHtml {request.Name}</h1></body></html>

"; } ``` If both exists (e.g. `Post()` and `PostAsync()`) the `*Async` method will take precedence and be invoked instead. Allowing both is useful if you have internal services directly invoking other Services using `HostContext.ResolveService()` where you can upgrade your Service to use an Async implementation without breaking existing clients, e.g. this is used in [RegisterService.cs](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/Auth/RegisterService.cs): ```csharp [Obsolete("Use PostAsync")] public object Post(Register request) { try { var task = PostAsync(request); return task.GetResult(); } catch (Exception e) { throw e.UnwrapIfSingleException(); } } /// /// Create new Registration /// public async Task PostAsync(Register request) { //... async impl } ``` To change to use an async implementation whilst retaining backwards compatibility with existing call sites, e.g: ```csharp using var service = HostContext.ResolveService(Request); var response = service.Post(new Register { ... }); ``` This is important if the response is ignored as the C# compiler wont give you any hints to await the response which can lead to timing issues where the Services is invoked but User Registration hasn't completed as-is often assumed. Alternatively you can rename your method to use `*Async` suffix so the C# compiler will fail on call sites so you can replace the call-sites to `await` the async `Task` response, e.g: ```csharp using var service = HostContext.ResolveService(Request); var response = await service.PostAsync(new Register { ... }); ``` ### Group Services by Tag Related Services by can be grouped by annotating **Request DTOs** with the `[Tag]` attribute where they'll enable functionality in a number of ServiceStack's metadata services where they'll be used to [Group Services in Open API](https://swagger.io/docs/specification/grouping-operations-with-tags/). This feature could be used to tag which Services are used by different platforms: ```csharp [Tag("web")] public class WebApi : IReturn {} [Tag("mobile")] public class MobileApi : IReturn {} [Tag("web"),Tag("mobile")] public class WebAndMobileApi : IReturn {} ``` Where they'll appear as a tab to additionally filter APIs in metadata pages: ![](/img/pages/metadata/tag-groups.webp) They're also supported in [Add ServiceStack Reference](/add-servicestack-reference) where it can be used in the [IncludeTypes](/csharp-add-servicestack-reference#includetypes) DTO customization option where tags can be specified using braces in the format `{tag}` or `{tag1,tag2,tag3}`, e.g: ``` /* Options: IncludeTypes: {web,mobile} ``` Or individually: ``` /* Options: IncludeTypes: {web},{mobile} ``` It works similar to [Dependent Type References wildcard syntax](/csharp-add-servicestack-reference#include-request-dto-and-its-dependent-types) where it expands all Request DTOs with the tag to include all its reference types so including a `{web}` tag would be equivalent to including all Request DTOs & reference types with that reference, e.g: ``` /* Options: IncludeTypes: WebApi.*,WebAndMobileApi.* ``` ### Micro ORMs and ADO.NET's IDbConnection Code-First Micro ORMS like [OrmLite](https://github.com/ServiceStack/ServiceStack.OrmLite) and [Dapper](https://github.com/StackExchange/Dapper) provides a pleasant high-level experience whilst working directly against ADO.NET's low-level `IDbConnection`. They both support all major databases so you immediately have access to a flexible RDBMS option out-of-the-box. 
At the same time you're not limited to using the providers contained in the `Service` class and can continue to use your own registered IOC dependencies (inc. an alternate IOC itself).

### Micro ORM POCOs make good DTOs

The POCOs used in Micro ORMs are particularly well suited for re-using as DTOs since they don't contain any of the circular references that the Heavy ORMs have (e.g. EF). OrmLite goes 1-step further and borrows pages from NoSQL's playbook where any complex property, e.g. `List<T>`, is transparently blobbed in a schema-less text field, promoting the design of frictionless **Pure POCOs** that are uninhibited by RDBMS concerns. In many cases these POCO data models already make good DTOs and can be returned directly instead of mapping to domain-specific DTOs.

### Calling Services from a Typed C# Client

In Service development your Service DTOs provide your technology-agnostic **Service Layer** which you want to keep clean and as 'dependency-free' as possible for maximum accessibility and potential re-use. Our recommendation is to follow our [Recommended Physical Project Structure](/physical-project-structure) and keep your DTOs in a separate ServiceModel project which ensures a well-defined ServiceContract [decoupled from their implementation and accessible from any client](/service-complexity-and-dto-roles#data-transfer-objects---dtos). This recommended physical project structure is embedded in each [ServiceStack VS.NET Template](/templates/). One of ServiceStack's strengths is its ability to re-use your Server DTOs on the client, enabling ServiceStack's productive end-to-end typed API. ServiceStack's use of Typed DTOs in its message-based design enables greater resiliency for your Services where the exact DTOs aren't needed, only the shape of the DTOs is important, and clients can also opt to use partial DTOs containing just the fields they're interested in. In the same way, extending existing Services with new optional properties won't break existing clients using older DTOs. When developing both Server and Client applications the easiest way to call typed Services from clients is to just have them reference the same ServiceModel .dll the Server uses to define its Service Contract, or for clients that only need to call a couple of Services you can choose to instead copy the class definitions as-is. In both cases calling Services is exactly the same where the Request DTO can be used with any of the generic [C#/.NET Service Clients](/csharp-client) to call Services using a succinct typed API, e.g:

#### Service Model Classes

```csharp
[Route("/contacts")]
public class GetContacts : IReturn<List<Contact>> { }

public class Contact { ... }
```

Which can be used in any Service Client with:

```csharp
var client = new JsonApiClient(BaseUri);
List<Contact> response = client.Get(new GetContacts());
```

Which makes a **GET** web request to the `/contacts` route. Custom Routes on Request DTOs are also not required, as when none are defined the client automatically falls back to using ServiceStack's [pre-defined routes](/routing#pre-defined-routes).

### Generating Typed DTOs

In addition to being able to share your `ServiceModel.dll` with .NET clients to enable a typed end-to-end API without code-gen, clients can alternatively choose to use [Add ServiceStack Reference](/csharp-add-servicestack-reference) support to provide an alternative way to get the Services' typed DTOs on the client.
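For example, as a hedged sketch (the Base URL is a placeholder), the C# DTOs for a remote ServiceStack App can be generated from the command-line with the dotnet `x` tool, and regenerated later by re-running the same command:

:::sh
x csharp https://localhost:5001
:::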
In both cases the exact same source code is used to call the Services:

```csharp
var client = new JsonApiClient(BaseUri);
var response = client.Get(new GetContacts());
```

Add ServiceStack Reference is also available for [most popular languages](/add-servicestack-reference) used in developing Web, Mobile and Desktop Apps.

#### Custom API Requests

When preferred, you can also use the previous more explicit client API (ideal for when you don't have the `IReturn<>` marker) which lets you call the Service using just its route:

```csharp
var response = client.Get<List<Contact>>("/contacts");
```

::: info
All these Service Client APIs **have async equivalents** with an `*Async` suffix
:::

### API QueryParams

ServiceStack's message-based design is centered around sending a single message which is all that's required to invoke any Typed API, however there may be times when you need to send additional params where you can't change the API's Request DTO definition or, in AutoQuery's case, its [Implicit Conventions](/autoquery/rdbms#implicit-conventions) would require too many permutations to be able to type the entire surface area on each Request DTO. Typically this would inhibit being able to invoke these Services from a typed Service Client API, which would instead need to either use the untyped [`Get(relativeUrl)`](https://reference.servicestack.net/api/ServiceStack/IRestClient/#-gettresponsestring) ServiceClient APIs or [HTTP Utils](/http-utils) to construct the API Request path manually. Alternatively Request DTOs can implement `IHasQueryParams` where any entries will be sent as additional query params along with the typed DTO:

```csharp
public interface IHasQueryParams
{
    Dictionary<string, string> QueryParams { get; set; }
}
```

Which is available in all AutoQuery DTOs where it's added as a non-serializable property so it's only included in the QueryString:

```csharp
[DataContract]
public abstract class QueryBase : IQuery, IHasQueryParams
{
    //...
    [IgnoreDataMember]
    public virtual Dictionary<string, string> QueryParams { get; set; }
}
```

Which allows using existing ServiceClient typed APIs to send a combination of untyped queries in AutoQuery requests, e.g:

```csharp
var api = await client.ApiAsync(new QueryContacts {
    IdsIn = new[]{ 1, 2, 3 },
    QueryParams = new() {
        ["LastNameStartsWith"] = "A"
    }
});
```

## Everything centered around Request DTOs

A nice property of ServiceStack's message-based design is that all functionality is centered around Typed Request DTOs, which easily lets you take advantage of high-level value-added functionality like [Auto Batched Requests](/auto-batched-requests) or [Encrypted Messaging](/auth/encrypted-messaging) which are enabled automatically without any effort, or easily opt in to enhanced functionality by decorating Request DTOs or their Services with Metadata and [Filter Attributes](/filter-attributes), and everything works together, bound against typed models naturally. E.g. you can take advantage of [ServiceStack's Razor support](https://razor.netcore.io/) and create a web page for this service by just adding a Razor view with the same name as the Request DTO in the `/Views` folder, which for the `GetContacts` Request DTO means you can just add `/Views/GetContacts.cshtml` and it will get rendered with the Service's Response DTO as its View Model when the Service is called from a browser (i.e. HTTP Request with `Accept: text/html`).
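As a rough sketch of that convention (markup kept minimal; the model assumes the `GetContacts` Service above returning `List<Contact>`), `/Views/GetContacts.cshtml` could be as simple as:

```html
@model List<Contact>
<h1>Contacts</h1>
<ul>
    @foreach (var contact in Model)
    {
        <li>@contact.Name</li>
    }
</ul>
```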
Thanks to ServiceStack's built-in Content Negotiation you can fetch the HTML contents by calling the same url:

```csharp
var html = $"{BaseUri}/contacts".GetStringFromUrl(accept:"text/html");
```

This [feature is particularly nice](https://razor.netcore.io/#unified-stack) as it lets you **re-use your existing services** to serve both Web and Native Mobile and Desktop clients.

### Action Filters

Service actions can also contain fine-grained application of Request and Response filters, e.g:

```csharp
public class ContactsService : Service
{
    [ClientCanSwapTemplates]
    public object Get(GetContacts request) => Db.Select<Contact>();
}
```

This Request Filter allows the client to [change the selected Razor **View** and **Template**](https://razor.netcore.io/#unified-stack) used at runtime. By default the view with the same name as the **Request** or **Response** DTO is used.

## Handling different HTTP Verbs

ServiceStack Services let you handle any HTTP Verb in the same way, e.g. this lets you respond with CORS headers to an HTTP **OPTIONS** request with:

```csharp
public class ContactsService : Service
{
    [EnableCors]
    public void Options(GetContact request) {}
}
```

Which if you now make an OPTIONS request to the above service, will emit the default `[EnableCors]` headers:

```csharp
var webReq = (HttpWebRequest)WebRequest.Create(Host + "/contacts");
webReq.Method = "OPTIONS";
using var webRes = webReq.GetResponse();
webRes.Headers["Access-Control-Allow-Origin"]  // *
webRes.Headers["Access-Control-Allow-Methods"] // GET, POST, PUT, DELETE, OPTIONS
webRes.Headers["Access-Control-Allow-Headers"] // Content-Type
```

### PATCH request example

Handling a PATCH request is just as easy, e.g. here's an example of using PATCH to handle a partial update of a Resource:

```csharp
[Route("/contacts/{Id}", "PATCH")]
public class UpdateContact : IReturn<Contact>
{
    public int Id { get; set; }
    public int Age { get; set; }
}

public Contact Patch(UpdateContact request)
{
    var contact = request.ConvertTo<Contact>();
    Db.UpdateNonDefaults(contact);
    return Db.SingleById<Contact>(request.Id);
}
```

And the client call is just as easy as you would expect:

```csharp
var response = client.Patch(new UpdateContact { Id = 1, Age = 18 });
```

Although sending different HTTP Verbs is unrestricted in native clients, they're unfortunately not allowed in some web browsers and proxies. So to simulate a PATCH from an AJAX request you need to set the **X-Http-Method-Override** HTTP Header.

## Structured Error Handling

When following the [explicit Response DTO Naming convention](/error-handling#error-response-types) ServiceStack will automatically populate the `ResponseStatus` property with a structured Error Response, otherwise if returning other DTOs like naked collections ServiceStack will instead return a generic `ErrorResponse`, although this is mostly a transparent technical detail you don't need to know about as for schema-less formats like JSON they return the exact same wire-format.
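Where you also want to control the returned HTTP status code, a hedged sketch (reusing the `GetContact` and `Contact` types above) is to throw one of ServiceStack's built-in `HttpError` helpers from within your Service:

```csharp
public object Get(GetContact request)
{
    var contact = Db.SingleById<Contact>(request.Id);
    if (contact == null)
        throw HttpError.NotFound($"Contact {request.Id} does not exist"); // 404 + structured ResponseStatus
    return contact;
}
```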
[Error Handling](/error-handling) works naturally in ServiceStack where you can simply throw C# Exceptions, e.g:

```csharp
public List<Contact> Post(Contact request)
{
    if (!request.Age.HasValue)
        throw new ArgumentException("Age is required");

    Db.Insert(request.ConvertTo<Contact>());
    return Db.Select<Contact>();
}
```

This will result in an Error thrown on the client if it tried to create an empty Contact:

```csharp
try
{
    var response = client.Post(new Contact());
}
catch (WebServiceException webEx)
{
    webEx.StatusCode                   // 400
    webEx.StatusDescription            // ArgumentException
    webEx.ResponseStatus.ErrorCode     // ArgumentException
    webEx.ResponseStatus.Message       // Age is required
    webEx.ResponseDto is ErrorResponse // true
}
```

The same Service Client Exception handling is also used to handle any HTTP error generated in or outside of your service, e.g. here's how to detect if an HTTP Method isn't implemented or is disallowed:

```csharp
try
{
    var response = client.Send(new SearchContacts());
}
catch (WebServiceException webEx)
{
    webEx.StatusCode        // 405
    webEx.StatusDescription // Method Not Allowed
}
```

In addition to standard C# exceptions your services can also return multiple, rich and detailed validation errors as enforced by [Fluent Validation's validators](/validation).

### Overriding the default Exception handling

You can override the default exception handling in ServiceStack by registering a `ServiceExceptionHandlers` handler, e.g:

```csharp
void Configure(Container container)
{
    this.ServiceExceptionHandlers.Add((req, reqDto, ex) => {
        return ...;
    });
}
```

## Smart Routing

For the most part you won't need to know about this as ServiceStack's routing works as you would expect. Although this should still serve as a good reference to describe the resolution order of ServiceStack's Routes:

1. Any exact Literal Matches are used first
2. Exact Verb match is preferred over All Verbs
3. The more variables in your route the less weighting it has
4. When Routes have the same weight, the order is determined by the position of the Action in the service or Order of Registration (FIFO)

These Rules only come into play when there are multiple routes that match the pathInfo of an incoming request. Let's see some examples of these rules in action using the routes defined in the [API Design test suite](https://github.com/ServiceStack/ServiceStack/blob/master/tests/RazorRockstars.Console.Files/ReqStarsService.cs):

```csharp
[Route("/contacts")] public class Contact {}
[Route("/contacts", "GET")] public class GetContacts {}
[Route("/contacts/{Id}", "GET")] public class GetContact {}
[Route("/contacts/{Id}/{Field}")] public class ViewContact {}
[Route("/contacts/{Id}/delete")] public class DeleteContact {}
[Route("/contacts/{Id}", "PATCH")] public class UpdateContact {}
[Route("/contacts/reset")] public class ResetContact {}
[Route("/contacts/search")]
[Route("/contacts/aged/{Age}")]
public class SearchContacts {}
```

These are the results for these HTTP Requests:

```
GET /contacts          => GetContacts
POST /contacts         => Contact
GET /contacts/search   => SearchContacts
GET /contacts/reset    => ResetContact
PATCH /contacts/reset  => ResetContact
PATCH /contacts/1      => UpdateContact
GET /contacts/1        => GetContact
GET /contacts/1/delete => DeleteContact
GET /contacts/1/foo    => ViewContact
```

And if there were multiple of the exact same routes declared like:

```csharp
[Route("/req/{Id}", "GET")] public class Req2 {}
[Route("/req/{Id}", "GET")] public class Req1 {}

public class MyService : Service
{
    public object Get(Req1 request) { ... }
    public object Get(Req2 request) { ...
    }
}
```

The Route on the Action that was declared first gets selected, i.e:

```
GET /req/1 => Req1
```

### Populating Complex Type Properties on QueryString

ServiceStack uses the [JSV-Format](/jsv-format) (JSON without quotes) to parse QueryStrings. JSV lets you embed deep object graphs in the QueryString as seen in [this example url](https://test.servicestack.net/json/reply/StoreLogs?Loggers=%5B%7BId:786,Devices:%5B%7BId:5955,Type:Panel,TimeStamp:1199303309,Channels:%5B%7BName:Temperature,Value:58%7D,%7BName:Status,Value:On%7D%5D%7D,%7BId:5956,Type:Tank,TimeStamp:1199303309,Channels:%5B%7BName:Volume,Value:10035%7D,%7BName:Status,Value:Full%7D%5D%7D%5D%7D%5D):

```
https://test.servicestack.net/json/reply/StoreLogs?Loggers=[{Id:786,Devices:[{Id:5955,Type:Panel,
Channels:[{Name:Temperature,Value:58},{Name:Status,Value:On}]},
{Id:5956,Type:Tank,TimeStamp:1199303309,
Channels:[{Name:Volume,Value:10035},{Name:Status,Value:Full}]}]}]
```

## Advanced Usages

### Custom Hooks

The ability to extend ServiceStack's service execution pipeline with Custom Hooks is an advanced customization feature that most of the time isn't needed, as the preferred way to add composable functionality to your services is to use [Request / Response Filter attributes](/filter-attributes) or apply them globally with [Global Request/Response Filters](/request-and-response-filters).

### Custom Serialized Responses

The `IHttpResult.ResultScope` API provides an opportunity to execute serialization within a custom scope, e.g. this can be used to customize the serialized response of adhoc services that's different from the default global configuration with:

```csharp
return new HttpResult(dto) {
    ResultScope = () => JsConfig.With(new Config { IncludeNullValues = true })
};
```

Which enables custom serialization behavior by performing the serialization within the custom scope, equivalent to:

```csharp
using (JsConfig.With(new Config { IncludeNullValues = true }))
{
    var customSerializedResponse = Serialize(dto);
}
```

### Request and Response Converters

The [Encrypted Messaging Feature](/auth/encrypted-messaging) takes advantage of Request and Response Converters that let you change the Request DTO and Response DTOs that get used in ServiceStack's Request Pipeline where:

#### Request Converters

Request Converters are executed directly after any [Custom Request Binders](/serialization-deserialization#create-a-custom-request-dto-binder):

```csharp
appHost.RequestConverters.Add(async (req, requestDto) => {
    //Return alternative Request DTO or null to retain existing DTO
});
```

#### Response Converters

Response Converters are executed directly after the Service:

```csharp
appHost.ResponseConverters.Add(async (req, response) => {
    //Return alternative Response or null to retain existing Service response
});
```

### Intercept Service Requests

As an alternative to creating a [Custom Service Runner](#using-a-custom-servicerunner) to intercept different events when processing ServiceStack Requests, you can instead override the `OnBeforeExecute()`, `OnAfterExecute()` and `OnExceptionAsync()` callbacks in your `Service` class (or base class) to intercept and modify Request DTOs, Responses or Error Responses, e.g:

```csharp
class MyServices : Service
{
    // Log all Request DTOs that implement IHasSessionId
    public override void OnBeforeExecute(object requestDto)
    {
        if (requestDto is IHasSessionId dtoSession)
        {
            Log.Debug($"{nameof(OnBeforeExecute)}: {dtoSession.SessionId}");
        }
    }

    //Return Response DTO Name in HTTP Header with Response
    public override
    object OnAfterExecute(object response)
    {
        return new HttpResult(response) {
            Headers = {
                ["X-Response"] = response.GetType().Name
            }
        };
    }

    //Return custom error with additional metadata
    public override Task<object> OnExceptionAsync(object requestDto, Exception ex)
    {
        var error = DtoUtils.CreateErrorResponse(requestDto, ex);
        if (error is IHttpError httpError)
        {
            var errorStatus = httpError.Response.GetResponseStatus();
            errorStatus.Meta = new Dictionary<string,string> {
                ["InnerType"] = ex.InnerException?.GetType().Name
            };
        }
        return Task.FromResult(error);
    }
}
```

#### Async Callbacks

For async callbacks your Services can implement `IServiceBeforeFilterAsync` and `IServiceAfterFilterAsync`, e.g:

```csharp
public class MyServices : Service, IServiceBeforeFilterAsync, IServiceAfterFilterAsync
{
    public async Task OnBeforeExecuteAsync(object requestDto)
    {
        //...
    }

    public async Task<object> OnAfterExecuteAsync(object response)
    {
        //...
        return response;
    }
}
```

If you're implementing `IService` instead of inheriting the concrete `Service` class, you can implement the interfaces directly:

```csharp
// Handle all callbacks
public class MyServices : IService, IServiceFilters
{
    //..
}

// Or individually, just the callbacks you want
public class MyServices : IService, IServiceBeforeFilter, IServiceAfterFilter, IServiceErrorFilter
{
    //..
}
```

### Custom Service Runner

The [IServiceRunner](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Web/IServiceRunner.cs) decouples the execution of your service from its implementation, which provides an alternative custom hook that lets you add custom behavior to all Services without needing to use a base Service class. To add your own Service Hooks you just need to override the default Service Runner in your AppHost from its default implementation:

```csharp
public virtual IServiceRunner<TRequest> CreateServiceRunner<TRequest>(ActionContext actionContext)
{
    return new ServiceRunner<TRequest>(this, actionContext); //Cached per Service Action
}
```

With your own:

```csharp
public override IServiceRunner<TRequest> CreateServiceRunner<TRequest>(ActionContext actionContext)
{
    return new MyServiceRunner<TRequest>(this, actionContext); //Cached per Service Action
}
```

Where `MyServiceRunner<TRequest>` is just a custom class implementing the custom hooks you're interested in, e.g:

```csharp
public class MyServiceRunner<TRequest> : ServiceRunner<TRequest>
{
    public MyServiceRunner(IAppHost appHost, ActionContext actionContext)
        : base(appHost, actionContext) {}

    public override void OnBeforeExecute(IRequest req, TRequest request, object service)
    {
        // Called just before any Action is executed
    }

    public override Task<object> ExecuteAsync(IRequest req, object instance, TRequest requestDto)
    {
        // Called to execute the Service instance with the requestDto
        return base.ExecuteAsync(req, instance, requestDto);
    }

    public override object OnAfterExecute(IRequest req, object response, object service)
    {
        // Called just after any Action is executed, you can modify the response returned here as well
        return response;
    }

    public override Task<object> HandleExceptionAsync(IRequest req, TRequest requestDto, Exception ex, object instance)
    {
        // Called whenever an exception is thrown in your Service's Action
        return base.HandleExceptionAsync(req, requestDto, ex, instance);
    }
}
```

## Limitations

One limitation of Services is that you can't split the handling of a single Resource (i.e. Request DTO) over multiple service implementations. If you find you need to do this because your service is getting too big, consider using partial classes to spread the implementation over multiple files. Another option is encapsulating some of the re-usable functionality into Logic dependencies and injecting them into your service.
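A rough sketch of that last option (the `IBookingLogic` abstraction and the DTO names are hypothetical), relying on the built-in IOC to auto-wire the shared logic into the Service:

```csharp
public interface IBookingLogic
{
    Task<List<Booking>> SearchAsync(DateTime bookedAfter);
}

public class BookingsService : Service
{
    // Auto-wired public property, e.g. registered with services.AddSingleton<IBookingLogic, BookingLogic>()
    public IBookingLogic BookingLogic { get; set; }

    public async Task<object> Get(SearchBookings request) =>
        await BookingLogic.SearchAsync(request.BookedAfter);
}
```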
## Other Notes Although they're not needed or used anywhere [you can also use HTTP Verb interfaces](https://github.com/ServiceStack/ServiceStack/blob/34acc429ee04053ea766e4fb183e7aad7321ef5e/src/ServiceStack.Interfaces/IService.cs#L27) to enforce the correct signature required by the services, e.g: ```csharp public class MyService : Service, IAny, IGet, IPost { public object Any(GetContacts request) { .. } public object Get(SearchContacts request) { .. } public object Post(Contact request) { .. } } ``` This has no effect to the runtime behaviour and your services will work the same way with or without the added interfaces. # Service Return Types Source: https://docs.servicestack.net/service-return-types From a birds-eye view ServiceStack can return any of: - Any **DTO** object -> serialized to Response ContentType - `HttpResult`, `HttpError`, `CompressedResult` or other `IHttpResult` for Customized HTTP response #### Services should only return Reference Types If a Value Type like `int` or `long` response is needed, it's recommended to wrap the Value Type in a Response DTO, e.g: ```csharp public class MyResponse { public int Result { get; set; } } ``` Alternatively you can return a naked Value Type response by returning it as a `string`, e.g: ```csharp public object Any(MyRequest request) => "1"; ``` ## Different Return Types The following types are not converted (to different Content-Types) but get written directly to the Response Stream: - `String` - `Stream` - `IStreamWriter` - `byte[]` - with the `application/octet-stream` Content Type - `ReadOnlyMemory` - `ReadOnlyMemory` From the [HelloWorld ServiceStack.UseCase](https://github.com/ServiceStack/ServiceStack.UseCases/blob/master/HelloWorld/Global.asax.cs) demo: ```csharp public class HelloService : Service { public HelloResponse Get(Hello request) { return new HelloResponse { Result = $"Hello, {request.Name}!" }; //C# client can call with: //var response = client.Get(new Hello { Name = "ServiceStack" }); } public string Get(HelloHtml request) { return $"

<h1>Hello, {request.Name}!</h1>";
    }

    [AddHeader(ContentType = "text/plain")]
    public string Get(HelloText request)
    {
        return $"Hello, {request.Name}!

"; } [AddHeader(ContentType = "image/png")] public Stream Get(HelloImage request) { var width = request.Width.GetValueOrDefault(640); var height = request.Height.GetValueOrDefault(360); var bgColor = request.Background != null ? Color.FromName(request.Background) : Color.ForestGreen; var fgColor = request.Foreground != null ? Color.FromName(request.Foreground) : Color.White; var image = new Bitmap(width, height); using (var g = Graphics.FromImage(image)) { g.Clear(bgColor); var drawString = $"Hello, {request.Name}!"; var drawFont = new Font("Times", request.FontSize.GetValueOrDefault(40)); var drawBrush = new SolidBrush(fgColor); var drawRect = new RectangleF(0, 0, width, height); var drawFormat = new StringFormat { LineAlignment = StringAlignment.Center, Alignment = StringAlignment.Center }; g.DrawString(drawString, drawFont, drawBrush, drawRect, drawFormat); var ms = new MemoryStream(); image.Save(ms, ImageFormat.Png); return ms; } } } ``` #### Live Examples of the above Hello Service: - [/hello/ServiceStack](http://bootstrapapi.apphb.com/api/hello/ServiceStack) - [/hello/ServiceStack?format=json](http://bootstrapapi.apphb.com/api/hello/ServiceStack?format=json) - [/hellotext/ServiceStack](http://bootstrapapi.apphb.com/api/hellotext/ServiceStack) - [/hellohtml/ServiceStack](http://bootstrapapi.apphb.com/api/hellohtml/ServiceStack) - [/helloimage/ServiceStack?Width=600&height=300&Foreground=Yellow](http://bootstrapapi.apphb.com/api/helloimage/ServiceStack?Width=600&height=300&Foreground=Yellow) ### Content-Type Specific Service Implementations Service implementations can use `Verb{Format}` method names to provide a different implementation for handling a specific Content-Type, e.g. the Service below defines several different implementation for handling the same Request: ```csharp [Route("/my-request")] public class MyRequest { public string Name { get; set; } } public class ContentTypeServices : Service { // Handles all other unspecified Verbs/Formats to /my-request public object Any(MyRequest request) => ...; // Handles GET /my-request for JSON responses public object GetJson(MyRequest request) => ..; // Handles POST/PUT/DELETE/etc /my-request for HTML Responses public object AnyHtml(MyRequest request) => $@"

<html><body><h1>AnyHtml {request.Name}</h1></body></html>

"; // Handles GET /my-request for HTML Responses public object GetHtml(MyRequest request) => $@"

<html><body><h1>GetHtml {request.Name}</h1></body></html>

"; } ``` This convention can be used for any of the formats listed in `ContentTypes.KnownFormats`, which by default includes: - json - xml - jsv - csv - html - protobuf - msgpack - wire ## Partial Content Support Partial Content Support allows a resource to be split up an accessed in multiple chunks for clients that support HTTP Range Requests. This is a popular feature in download managers for resuming downloads of large files and streaming services for real-time streaming of content (e.g. consumed whilst it's being watched or listened to). [HTTP Partial Content Support](http://benramsey.com/blog/2008/05/206-partial-content-and-range-requests/) is added in true ServiceStack-style where it's now automatically and transparently enabled for any existing services returning: #### A Physical File ```csharp return new HttpResult(new FileInfo(filePath), request.MimeType); ``` #### A Virtual File ```csharp return new HttpResult(VirtualFileSources.GetFile(virtualPath)); ``` #### A Memory Stream ```csharp return new HttpResult(ms, "audio/mpeg"); ``` #### Raw Bytes ```csharp return new HttpResult(bytes, "image/png"); ``` #### Raw Text ```csharp return new HttpResult(customText, "text/plain"); ``` Partial Content was also added to static file downloads served directly through ServiceStack which lets you stream mp3 downloads or should you ever want to your static .html, .css, .js, etc. You can disable Partial Content support with `Config.AllowPartialResponses = false;`. See the [PartialContentResultTests](https://github.com/ServiceStack/ServiceStack/blob/master/tests/ServiceStack.WebHost.Endpoints.Tests/PartialContentResultTests.cs) for more examples. ## Writing directly to the Response Stream In addition to returning plain C# objects, ServiceStack allows you to return any **Stream** or `IStreamWriterAsync` (which is a bit more flexible on how you write to the response stream): ```csharp public interface IStreamWriterAsync { Task WriteToAsync(Stream responseStream, CancellationToken token=default); } ``` Both though allow you to write directly to the Response OutputStream without any additional conversion overhead. ### Customizing HTTP Headers If you want to customize the HTTP headers at the same time you just need to implement [IHasOptions](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack.Interfaces/Web/IHasOptions.cs) where any Dictionary Entry is written to the Response HttpHeaders. ```csharp public interface IHasOptions { IDictionary Options { get; } } ``` Further than that, the IHttpResult allows even finer-grain control of the HTTP output (status code, headers, ...) where you can supply a custom Http Response status code. You can refer to the implementation of the [HttpResult](https://github.com/ServiceStack/ServiceStack/blob/master/src/ServiceStack/HttpResult.cs) object for a real-world implementation of these above interfaces. ### Further customizing the HTTP Response See the [Customize HTTP Responses](/customize-http-responses) page for more ways of customizing the HTTP Response. # Design RESTful Services Source: https://docs.servicestack.net/design-rest-services ServiceStack encourages a message-based design so each Service should have its own distinct message (aka Request DTO) where it's able to use explicit properties to define what each Service accepts. Something to keep in mind is how you define and design your Services in ServiceStack are de-coupled in how you expose them which can be exposed under any custom Route. 
### Use a logical / hierarchical Url structure We recommend adopting a logical hierarchically structured URL that represents the identifier of a resource, i.e. the parent path categorizes your resource and gives it meaningful context. So if you needed to design an API for System that maintained **Events** and their **Reviews** it could adopt the following url structure: ``` /events # all events /events/1 # event #1 /events/1/reviews # event #1 reviews ``` Where each of the above resource identifiers can be invoked using any HTTP **Verb** which represents the action to take on them, e.g: ``` GET /events # View all Events POST /events # Create a new Event PUT /events/{Id} # Update an existing Event DELETE /events/{Id} # Delete an existing Event ``` ### Implementing RESTful Routes For their implementation ServiceStack encourages a message-based design that groups all related operations based on **Response type** and **Call Context**. For an Events and Reviews system it could look something like: ```csharp [Route("/events", "GET")] [Route("/events/category/{Category}", "GET")] // Optional GET example public class SearchEvents : IReturn> { //resultset filter examples, e.g. ?Category=Tech&Query=servicestack public string Category { get; set; } public string Query { get; set; } } [Route("/events", "POST")] public class CreateEvent : IReturn { public string Name { get; set; } public DateTime StartDate { get; set; } } [Route("/events/{Id}", "GET")] [Route("/events/code/{EventCode}", "GET")] // Alternative Id public class GetEvent : IReturn { public int Id { get; set; } public string EventCode { get; set; } // Alternative to fetch Events } [Route("/events/{Id}", "PUT")] public class UpdateEvent : IReturnVoid { public int Id { get; set; } public string Name { get; set; } public DateTime StartDate { get; set; } } ``` Event Reviews would follow a similar pattern: ```csharp [Route("/events/{EventId}/reviews", "GET")] public class GetEventReviews : IReturn> { public int EventId { get; set; } } [Route("/events/{EventId}/reviews/{Id}", "GET")] public class GetEventReview : IReturn { public int EventId { get; set; } public int Id { get; set; } } [Route("/events/{EventId}/reviews", "POST")] public class CreateEventReview : IReturn { public int EventId { get; set; } public string Comments { get; set; } } ``` The above REST Service examples returns naked Types and collections which [ServiceStack has a great story for](/api-design#structured-error-handling), however our personal preference is to design more coarse-grained and versionable [Message-based APIs](/design-message-based-apis) where we'd use an explicit Response DTO for each Service, e.g: ```csharp [Route("/events/{EventId}/reviews", "GET")] public class GetEventReviews : IReturn { public int EventId { get; set; } } public class GetEventReviewsResponse { public List Results { get; set; } } [Route("/events/{EventId}/reviews/{Id}", "GET")] public class GetEventReview : IReturn { public int EventId { get; set; } public int Id { get; set; } } public class GetEventReviewResponse { public EventReview Result { get; set; } public ResponseStatus ResponseStatus { get; set; } // inject structured errors if any } [Route("/events/{EventId}/reviews", "POST")] public class CreateEventReview : IReturn { public int EventId { get; set; } public string Comments { get; set; } } public class CreateEventReviewResponse { public EventReview Result { get; set; } public ResponseStatus ResponseStatus { get; set; } } ``` ### Notes The implementation of each Services then becomes 
straight-forward based on these messages, which (depending on code-base size) we'd recommend organizing in 2 **EventsService** and **EventReviewsService** classes. Although `UpdateEvent` and `CreateEvent` are seperate Services here, if the use-case permits they can instead be handled by a single idempotent `StoreEvent` Service. ## [Physical Project Structure](/physical-project-structure) Ideally the root-level **AppHost** project should be kept lightweight and implementation-free. Although for small projects or prototypes with only a few services it's ok for everything to be in a single project and to simply grow your architecture when and as needed. For medium-to-large projects we recommend the physical structure below which for the purposes of this example we'll assume our Application is called **Events**. The order of the projects also show its dependencies, e.g. the top-level `Events` project references **all** sub projects whilst the last `Events.ServiceModel` project references **none**: ``` /Events AppHost.cs // ServiceStack Web or Self Host Project /Events.ServiceInterface // Service implementations (akin to MVC Controllers) EventsService.cs EventsReviewsService.cs /Events.Logic // For large projects: extract C# logic, data models, etc IGoogleCalendarGateway // E.g of a external dependency this project could use /Events.ServiceModel // Service Request/Response DTOs and DTO types Events.cs // SearchEvents, CreateEvent, GetEvent DTOs EventReviews.cs // GetEventReviews, CreateEventReview Types/ Event.cs // Event type EventReview.cs // EventReview type ``` With the `Events.ServiceModel` DTO's kept in their own separate implementation and dependency-free dll, you're freely able to share this dll in any .NET client project as-is - which you can use with any of the generic [C# Service Clients](/csharp-server-events-client) to provide an end-to-end typed API without any code-gen. ## More Info - This recommended project structure is embedded in all [ServiceStackVS VS.NET Templates](/templates/). - The [Simple Customer REST Example](/why-servicestack#simple-customer-database-rest-services-example) is a small self-contained, real-world example of creating a simple REST Service utilizing an RDBMS. # Design Message-based APIs Source: https://docs.servicestack.net/design-message-based-apis To give you a flavor of the differences you should think about when designing message-based services in ServiceStack we'll look at some examples to contrast WCF/WebApi vs ServiceStack's approach: ## WCF vs ServiceStack API Design WCF encourages you to think of web services as normal C# method calls, e.g: ```csharp public interface IWcfCustomerService { Customer GetCustomerById(int id); List GetCustomerByIds(int[] id); Customer GetCustomerByUserName(string userName); List GetCustomerByUserNames(string[] userNames); Customer GetCustomerByEmail(string email); List GetCustomerByEmails(string[] emails); } ``` This is what the same Service contract would look like in ServiceStack: ```csharp public class Customers : IReturn> { public int[] Ids { get; set; } public string[] UserNames { get; set; } public string[] Emails { get; set; } } ``` The important concept to keep in mind is that the entire query (aka Request) is captured in the Request Message (i.e. Request DTO) and not in the server method signatures. 
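A hedged sketch (query logic and the `Customer` data model's fields are assumptions based on the WCF example above) of the single Service implementation that can satisfy any combination of those queries from the one `Customers` Request DTO:

```csharp
public class CustomersService : Service
{
    public object Any(Customers request)
    {
        var q = Db.From<Customer>();

        // Each populated filter is OR'd together so any combination of
        // Ids, UserNames and Emails can be fulfilled in a single request
        if (request.Ids?.Length > 0)
            q.Or(x => request.Ids.Contains(x.Id));
        if (request.UserNames?.Length > 0)
            q.Or(x => request.UserNames.Contains(x.UserName));
        if (request.Emails?.Length > 0)
            q.Or(x => request.Emails.Contains(x.Email));

        return Db.Select(q);
    }
}
```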
The obvious immediate benefit of adopting a message-based design is that any combination of the above RPC calls can be fulfilled in 1 remote message, by a single service implementation which improves cacheability and simplifies maintenance and testing with the reduced API surface area. ## WebApi vs ServiceStack API Design Likewise WebApi promotes a similar C#-like RPC Api that WCF does: ```csharp public class ProductsController : ApiController { public IEnumerable GetAllProducts() { return products; } public Product GetProductById(int id) { var product = products.FirstOrDefault((p) => p.Id == id); if (product == null) { throw new HttpResponseException(HttpStatusCode.NotFound); } return product; } public Product GetProductByName(string categoryName) { var product = products.FirstOrDefault((p) => p.Name == categoryName); if (product == null) { throw new HttpResponseException(HttpStatusCode.NotFound); } return product; } public IEnumerable GetProductsByCategory(string category) { return products.Where(p => string.Equals(p.Category, category, StringComparison.OrdinalIgnoreCase)); } public IEnumerable GetProductsByPriceGreaterThan(decimal price) { return products.Where((p) => p.Price > price); } } ``` ### ServiceStack Message-Based API Design Whilst ServiceStack encourages you to retain a Message-based Design: ```csharp public class SearchProducts : IReturn> { public string Category { get; set; } public decimal? PriceGreaterThan { get; set; } } public class GetProduct : IReturn { public int? Id { get; set; } public string Name { get; set; } } public class ProductsService : Service { public object Get(SearchProducts request) { var ret = products.AsQueryable(); if (request.Category != null) ret = ret.Where(x => x.Category == request.Category); if (request.PriceGreaterThan.HasValue) ret = ret.Where(x => x.Price > request.PriceGreaterThan.Value); return ret.ToList(); } public Product Get(GetProduct request) { var product = request.Id.HasValue ? products.FirstOrDefault(x => x.Id == request.Id.Value) : products.FirstOrDefault(x => x.Name == request.Name); if (product == null) throw new HttpError(HttpStatusCode.NotFound, "Product does not exist"); return product; } } ``` Again capturing the essence of the Request in the Request DTO. The message-based design is also able to condense **5 separate RPC** WebAPI Services into **2 message-based** ServiceStack Services. ## Group by Call Semantics and Response Types It's grouped into 2 different services in this example based on **Call Semantics** and **Response Types**: Every property in each Request DTO has the same semantics that is for `SearchProducts` each property acts like a Filter (e.g. an AND) whilst in `GetProduct` it acts like a combinator (e.g. an OR). The Services also return `List` and `Product` return types which will require different handling in the call-sites of Typed APIs. In WCF / WebAPI (and other RPC services frameworks) whenever you have a client-specific requirement you would add a new Server signature on the controller that matches that request. In ServiceStack's message-based approach however you're instead encouraged to think about where this feature intuitively fits and whether you're able to enhance existing services. You should also be thinking about how you can support the client-specific requirement in a **generic way** so that the same service could benefit other future potential use-cases. ### Separate One and Many Services We can use the above context as a guide to design new Services. 
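To illustrate the different call-site handling (a sketch assuming the DTOs above implement `IReturn<List<Product>>` and `IReturn<Product>` respectively, with `BaseUri` as a placeholder), the distinct Response Types surface directly in the typed client APIs:

```csharp
var client = new JsonApiClient(BaseUri);

// SearchProducts acts as a filter and returns a collection
List<Product> products = client.Get(new SearchProducts { Category = "Tech" });

// GetProduct acts as a combinator and returns a single resource
Product product = client.Get(new GetProduct { Id = 1 });
```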
If we needed to design a Bookings System that needed an API to return **All Bookings** and a **Single Booking** we'd use separate Services as they'd have different Response Types, e.g. `GetBooking` returns 1 booking whilst `GetBookings` returns many.

### Distinguish Service Operations vs Types

There should be a clean split between your Operations (aka Request DTOs), which are unique per Service and used to capture the Service's request, and the DTO types they return. Request DTOs are usually actions so they're verbs, whilst DTO types are entities/data-containers so they're nouns.

### Returning naked collections

ServiceStack can return naked collections that [don't require a ResponseStatus](/error-handling#error-response-types) property since if it doesn't exist the generic `ErrorResponse` DTO will be thrown and serialized on the client instead, which frees you from having your Responses contain a `ResponseStatus` property.

### Returning coarse-grained Response DTOs

However since they offer better versionability that can later be extended to return more results without breaking existing clients, we prefer specifying explicit Response DTOs for each Service, although this is entirely optional. So our preferred message-based design would look similar to:

```csharp
// Operations
[Route("/bookings/{Id}")]
public class GetBooking : IReturn<GetBookingResponse>
{
    public int Id { get; set; }
}
public class GetBookingResponse
{
    public Booking Result { get; set; }
    public ResponseStatus ResponseStatus { get; set; } // inject structured errors
}

[Route("/bookings/search")]
public class SearchBookings : IReturn<SearchBookingsResponse>
{
    public DateTime BookedAfter { get; set; }
}
public class SearchBookingsResponse
{
    public List<Booking> Results { get; set; }
    public ResponseStatus ResponseStatus { get; set; } // inject structured errors
}

// Types
public class Booking
{
    public int Id { get; set; }
    public int ShiftId { get; set; }
    public DateTime StartDate { get; set; }
    public DateTime EndDate { get; set; }
    public int Limit { get; set; }
}
```

When they're not ambiguous we'll typically leave out specifying the **Verb** in `[Route]` definitions for **GET** Requests as it's unnecessary.

### Using AutoQuery

Where possible we'll also use [AutoQuery for Search Services](/autoquery/rdbms) which requires dramatically less effort whilst offering a lot more functionality out-of-the-box. E.g. the Search Bookings Service with AutoQuery could adopt the same Custom Route and properties:

```csharp
[Route("/bookings/search")]
public class SearchBookings : QueryDb<Booking>
{
    public DateTime BookedAfter { get; set; }
}
```

But no implementation is needed as AutoQuery automatically creates the optimal implementation. AutoQuery also supports [Implicit Conventions](/autoquery/rdbms#implicit-conventions) where you're able to filter by any of the `Booking` table columns without any additional code or effort.

### Keep a consistent Nomenclature

You should reserve the word **Get** for services which query on unique or Primary Key fields, i.e. when a supplied value matches a field (e.g. Id) it only **Gets** 1 result. For "Search Services" that act like a filter and return multiple matching results within a desired range, we recommend prefixing Services with the **Search** or **Find** verbs to signal the behavior of the Service.

### Self-describing Service Contracts

Also try to be descriptive with each of your field names; these properties are part of your **public API** and should be self-describing as to what they do. E.g. by just looking at the Service Contract (e.g.
Request DTO) we'd have no idea what a plain **Date** property means, as it could mean either **BookedAfter**, **BookedBefore** or **BookedOn** if it only returned bookings made on that Day. The benefit of this is that the call-sites of your [Typed .NET clients](/csharp-client) now become easier to read:

```csharp
Product product = client.Get(new GetProduct { Id = 1 });

var response = client.Get(new SearchBookings { BookedAfter = DateTime.Today });
```

## Service implementation

[Filter Attributes](/filter-attributes) can be applied on either the **class** or **method** level, so when you need to secure all Operations within a given Service you can just annotate the top-level Service class with the `[Authenticate]` attribute, e.g:

```csharp
[Authenticate]
public class BookingsService : Service
{
    public object Get(GetBooking request) => ...;
    public object Get(SearchBookings request) => ...;
}
```

## Error Handling and Validation

To add validation you can either just [throw C# exceptions](/error-handling#throwing-c-exceptions) and apply your own customizations to them, or you also have the option to use the built-in [Declarative Validation](/declarative-validation) attributes on your Request DTO:

```csharp
[ValidateIsAuthenticated]
public class CreateBooking : IPost, IReturn
{
    [ValidateNotNull]
    public DateTime? StartDate { get; set; }
    [ValidateGreaterThan(0)]
    public int ShiftId { get; set; }
    [ValidateGreaterThan(0)]
    public int Limit { get; set; }
}
```

Or for more control you can use custom [Fluent Validation](/validation) validators. Validators are no-touch and non-invasive, meaning you can add them using a layered approach and maintain them without modifying the service implementation or DTO classes. Since they require an extra class, we'd only use them on operations with side-effects, e.g. **POST** or **PUT**, as **GET** requests tend to have minimal validation so throwing C# Exceptions typically requires less boilerplate. Here's an example of a validator you could have when creating a Booking:

```csharp
public class CreateBookingValidator : AbstractValidator<CreateBooking>
{
    public CreateBookingValidator()
    {
        RuleFor(r => r.StartDate).NotEmpty();
        RuleFor(r => r.ShiftId).NotEmpty().GreaterThan(0);
        RuleFor(r => r.Limit).NotEmpty().GreaterThan(0);
    }
}
```

Depending on the use-case, instead of having separate `CreateBooking` and `UpdateBooking` DTOs you could re-use the same `StoreBooking` Request DTO to handle both operations.

# Modular Startup

Source: https://docs.servicestack.net/modular-startup

::: info
For more information on the earlier Modular Startup in ServiceStack **v5.x** see our [Legacy Modular Startup](/modular-startup-legacy) docs
:::

Taking advantage of C# 9 top level statements and the .NET 6 [WebApplication Hosting Model](https://gist.github.com/davidfowl/0e0372c3c1d895c3ce195ba983b1e03d), ServiceStack templates utilize both of these features to simplify configuring your AppHost in a modular way. `Program.cs` becomes a script-like file since C# 9 top level statements generate the application entry point implicitly.

```csharp
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Configure the HTTP request pipeline.
if (!app.Environment.IsDevelopment())
{
    app.UseExceptionHandler("/Error");
    app.UseHsts();
    app.UseHttpsRedirection();
}

app.UseServiceStack(new AppHost());

app.Run();
```

The application `AppHost` hooks into startup using the `HostingStartup` assembly attribute.
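For example, a hypothetical `Configure.Db.cs` startup module (a sketch assuming the `ServiceStack.OrmLite.Sqlite` package is referenced; names and connection string are illustrative) follows the same pattern to register an RDBMS connection factory:

```csharp
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;
using ServiceStack.Data;
using ServiceStack.OrmLite;

[assembly: HostingStartup(typeof(MyApp.ConfigureDb))]

namespace MyApp;

public class ConfigureDb : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            // Register an OrmLite connection factory with the ASP.NET Core IOC
            services.AddSingleton<IDbConnectionFactory>(new OrmLiteConnectionFactory(
                "App_Data/app.db", SqliteDialect.Provider));
        });
}
```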
In ServiceStack templates, these startup modules use the file name prefix of `Configure.*.cs` to make them easy to identify. All of ServiceStack's features are loaded using .NET's `HostingStartup`, including ServiceStack's `AppHost` itself that's now being configured in [Configure.AppHost.cs](https://github.com/NetCoreTemplates/web/blob/master/MyApp/Configure.AppHost.cs), e.g:

```csharp
[assembly: HostingStartup(typeof(MyApp.AppHost))]

namespace MyApp;

public class AppHost() : AppHostBase("MyApp"), IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            // Configure ASP.NET Core IOC Dependencies
        });

    public override void Configure()
    {
        // Configure ServiceStack, Run custom logic after ASP.NET Core Startup
        SetConfig(new HostConfig {
        });
    }
}
```

The use of Modular Startup does not change the AppHost declaration, but enables the modular grouping of configuration concerns. Different features are encapsulated together, allowing them to be more easily updated or replaced, e.g. each feature could be temporarily disabled by commenting out its assembly `HostingStartup` attribute:

```csharp
//[assembly: HostingStartup(typeof(MyApp.AppHost))]
```

## Module composition using `mix`

This has enabled ServiceStack Apps to be easily composed with just the features developers need, either at project creation using the [servicestack.net/start](https://servicestack.net/start) page or after a project's creation, where features can easily be added and removed using the command-line [mix tool](/mix-tool). You can view all available mix gists that can be added to projects with:

:::sh
x mix
:::

.NET 6's idiom is incorporated into the [mix gist config files](https://gist.github.com/gistlyn/9b32b03f207a191099137429051ebde8) to adopt its `HostingStartup`, which is better able to load modular Startup configuration without assembly scanning. This is a standard ASP .NET Core feature that we can use to configure Mongo DB in any ASP .NET Core App with:

:::sh
x mix mongodb
:::

Which adds the `mongodb` gist file contents to your ASP .NET Core Host project:

```csharp
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;
using MongoDB.Driver;

[assembly: HostingStartup(typeof(MyApp.ConfigureMongoDb))]

namespace MyApp;

public class ConfigureMongoDb : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context, services) => {
            var mongoClient = new MongoClient();
            IMongoDatabase mongoDatabase = mongoClient.GetDatabase("MyApp");
            services.AddSingleton(mongoDatabase);
        });
}
```

As it's not a ServiceStack feature, it can be used to configure ASP .NET Core Apps with any feature, e.g. we could also easily configure [Marten](https://martendb.io) in an ASP .NET Core App with:

:::sh
x mix marten
:::

The benefit of this approach is that entire modules of features can be configured in a single command, e.g.
An empty ServiceStack App can be configured with MongoDB, ServiceStack Auth and a MongoDB Auth Repository with a single command:

:::sh
x mix auth auth-mongodb mongodb
:::

Likewise, you can replace MongoDB with a completely different PostgreSQL RDBMS implementation by running:

:::sh
x mix auth auth-db postgres
:::

### Services and App Customizations

Modular Startup configurations are flexible enough to encapsulate customizing both ASP.NET Core's IOC and the built `WebApplication` by registering an `IStartupFilter`, which is required by the Open API v3 Modular Configuration:

:::sh
x mix openapi3
:::

```csharp
[assembly: HostingStartup(typeof(MyApp.ConfigureOpenApi))]

namespace MyApp;

public class ConfigureOpenApi : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices((context, services) => {
            if (context.HostingEnvironment.IsDevelopment())
            {
                services.AddEndpointsApiExplorer();
                services.AddSwaggerGen();
                services.AddServiceStackSwagger();
                services.AddBasicAuth();
                //services.AddJwtAuth();

                services.AddTransient<IStartupFilter, StartupFilter>();
            }
        });

    public class StartupFilter : IStartupFilter
    {
        public Action<IApplicationBuilder> Configure(Action<IApplicationBuilder> next) => app => {
            app.UseSwagger();
            app.UseSwaggerUI();
            next(app);
        };
    }
}
```

### ConfigureAppHost

Looking deeper, we can see how plugins are able to configure ServiceStack via the `.ConfigureAppHost()` extension method to execute custom logic on `AppHost` Startup:

```csharp
[assembly: HostingStartup(typeof(MyApp.ConfigureAutoQuery))]

namespace MyApp;

public class ConfigureAutoQuery : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            // Enable Audit History
            services.AddSingleton<ICrudEvents>(c =>
                new OrmLiteCrudEvents(c.GetRequiredService<IDbConnectionFactory>()));

            // For TodosService
            services.AddPlugin(new AutoQueryDataFeature());

            // For Bookings https://docs.servicestack.net/autoquery-crud-bookings
            services.AddPlugin(new AutoQueryFeature {
                MaxLimit = 1000,
                //IncludeTotal = true,
            });
        })
        .ConfigureAppHost(appHost => {
            appHost.Resolve<ICrudEvents>().InitSchema();
        });
}
```

### Customize AppHost at different Startup Lifecycles

By default, any AppHost configuration is called before `AppHost.Configure()` is run, but to cater for all plugins, AppHost configurations can be registered at different stages within the AppHost's initialization:

```csharp
public void Configure(IWebHostBuilder builder) => builder
    .ConfigureAppHost(
        beforeConfigure:    appHost => /* fired before AppHost.Configure() */,
        afterConfigure:     appHost => /* fired after AppHost.Configure() */,
        afterPluginsLoaded: appHost => /* fired after plugins are loaded */,
        afterAppHostInit:   appHost => /* fired after AppHost has initialized */);
```

### Removing Features

The benefit of adopting a modular approach to AppHost configuration is the same as for good code organization in general: better decoupling and cohesion, where it's easier to determine all the dependencies of a feature, easier to update, less chance of unintended side effects, easier to share standard configuration amongst multiple projects and easier to remove a feature entirely, either temporarily when needing to isolate & debug a runtime issue:

```csharp
// [assembly: HostingStartup(typeof(MyApp.ConfigureAuth))]
```

Or permanently, by either directly deleting the isolated `*.cs` source files or by undoing mixing in the feature using `mix -delete`, e.g:

:::sh
x mix -delete auth auth-db postgres
:::

Which works similar to package managers in that it removes all files contained within each
mix gist.

::: info
Please see the [Mix HowTo](https://gist.github.com/gistlyn/9b32b03f207a191099137429051ebde8#file-mix_howto-md) to find out how you can contribute your own gist mix features
:::

## Migrating to HostingStartup

As we'll be using the new `HostingStartup` model going forward, we recommend migrating your existing configuration to use it. To help with this you can refer to the [mix diff](https://github.com/ServiceStack/mix/commit/b56746622aa1879e3e6a8cbf835e634f05db30db) showing how each of the existing mix configurations was converted to the new model.

As a concrete example, let's take a look at the steps used to migrate our Chinook example application [from .NET 5 using the previous `Startup : ModularStartup`, to .NET 6 `HostingStartup`](https://github.com/NetCoreApps/Chinook/commit/2758af9deae9c3aa910a27134f95167f7ec6e541).

### Step 1

Migrate your existing `ConfigureServices` and `Configure(IApplicationBuilder)` from `Startup : ModularStartup` to the top-level host builder in `Program.cs`, e.g:

```csharp
var builder = WebApplication.CreateBuilder(args);

var app = builder.Build();

// Configure the HTTP request pipeline.
if (!app.Environment.IsDevelopment())
{
    app.UseExceptionHandler("/Error");
    // The default HSTS value is 30 days.
    // You may want to change this for production scenarios, see https://aka.ms/aspnetcore-hsts.
    app.UseHsts();
    app.UseHttpsRedirection();
}

app.Run();
```

### Step 2

Move your `AppHost` class to a new `Configure.AppHost.cs` file.

### Step 3

Implement `IHostingStartup` on your AppHost with automatic initialization, e.g:

```csharp
public void Configure(IWebHostBuilder builder)
{
    builder.ConfigureServices(services => {
        // Configure ASP.NET Core IOC Dependencies
    });
}
```

### Step 4

Declare `assembly: HostingStartup` for your `AppHost` in the same `Configure.AppHost.cs`, e.g:

```csharp
[assembly: HostingStartup(typeof(Chinook.AppHost))]
```

### Step 5

Migrate each existing modular startup class that implements `IConfigureServices` and/or `IConfigureApp` to use `IHostingStartup` (a sketch of migrating a module that uses both interfaces follows at the end of this section), e.g:

```csharp
// net5.0 modular startup
using ServiceStack;

namespace Chinook;

public class ConfigureAutoQuery : IConfigureAppHost
{
    public void Configure(IAppHost appHost)
    {
        appHost.Plugins.Add(new AutoQueryFeature {
            MaxLimit = 1000,
            IncludeTotal = true
        });
    }
}
```

```csharp
// net8.0 modular startup using IHostingStartup
using Microsoft.AspNetCore.Hosting;
using ServiceStack;

[assembly: HostingStartup(typeof(Chinook.ConfigureAutoQuery))]

namespace Chinook;

public class ConfigureAutoQuery : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            services.AddPlugin(new AutoQueryFeature {
                MaxLimit = 1000,
                IncludeTotal = true
            });
        });
}
```

> Remember also that infrastructure like your `Dockerfile` or host will likely need its runtimes/SDKs updated as well.
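Step 5's before/after only shows an `IConfigureAppHost` module. As a rough sketch of how a module implementing both `IConfigureServices` and `IConfigureApp` could be migrated, the `IApplicationBuilder` configuration can move into an `IStartupFilter`, mirroring the earlier openapi3 example. The `ConfigureCors` module below is hypothetical and not part of the Chinook migration:

```csharp
// Hypothetical net5.0 module using both IConfigureServices and IConfigureApp
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;
using ServiceStack;

namespace Chinook;

public class ConfigureCors : IConfigureServices, IConfigureApp
{
    public void Configure(IServiceCollection services) => services.AddCors();

    public void Configure(IApplicationBuilder app) =>
        app.UseCors(policy => policy.AllowAnyOrigin().AllowAnyHeader().AllowAnyMethod());
}
```

```csharp
// Hypothetical IHostingStartup equivalent: the IConfigureServices logic moves into
// ConfigureServices and the IConfigureApp logic into an IStartupFilter
using System;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.DependencyInjection;

[assembly: HostingStartup(typeof(Chinook.ConfigureCors))]

namespace Chinook;

public class ConfigureCors : IHostingStartup
{
    public void Configure(IWebHostBuilder builder) => builder
        .ConfigureServices(services => {
            services.AddCors();
            services.AddTransient<IStartupFilter, StartupFilter>();
        });

    public class StartupFilter : IStartupFilter
    {
        public Action<IApplicationBuilder> Configure(Action<IApplicationBuilder> next) => app => {
            app.UseCors(policy => policy.AllowAnyOrigin().AllowAnyHeader().AllowAnyMethod());
            next(app);
        };
    }
}
```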
# ServiceStack's .NET Core Utility Belt

Source: https://docs.servicestack.net/dotnet-tool

Our `x` and `app` dotnet tools are versatile, invaluable companions for all ServiceStack developers, jam-packed with functionality to power a number of exciting scenarios: they serve as a [Sharp App](https://sharpscript.net/docs/sharp-apps) delivery platform where Sharp Apps can be run as a .NET Core Windows Desktop App with `app` or as a cross-platform Web App launcher using `web`, and we've already seen how they act as a [`#Script` runner](https://sharpscript.net/docs/sharp-scripts) with `x run` and a [Live `#Script` playground](https://sharpscript.net/docs/sharp-scripts#live-script-with-web-watch) with `x watch`.

These tools contain all the functionality ServiceStack Developers or API consumers need, and can be used to [Create ServiceStack projects](/dotnet-new), run [Gist Desktop Apps](https://sharpscript.net/sharp-apps/gist-desktop-apps) or generate typed endpoints for consuming ServiceStack Services, either via [Add/Update ServiceStack References](/add-servicestack-reference) or by generating [gRPC client proxies](/grpc#grpc-clients).

## Install

To access available features, install with:

:::sh
dotnet tool install --global x
:::

### Update

Or if you had a previous version installed, update with:

:::sh
dotnet tool update -g x
:::

::: info
Both `x` and `app` have equivalent base functionality, whilst `app` has superset [Windows-only Desktop features](/netcore-windows-desktop)
:::

::: info
To update and download Add ServiceStack Reference DTOs without .NET see [npx get-dtos](/npx-get-dtos)
:::

## Usage

Then run `x` without any arguments to view Usage:

:::sh
x
:::

```txt
Usage:

x new                     List available Project Templates
x new