The Unity SDK is C# and slots right into the typical Unity patterns for touch input.
Two contact types are provided:
Finger – Representing a single touch point, e.g., a finger
Glyph – Representing a tangible object
For each contact, you get ID, Position, and Phase. For Glyphs, you also get orientation and a touched status, i.e., the system knows whether the object is currently being touched or not. There are tunable parameters for the tracking system as well.
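As an illustration only, not the SDK's actual API, polling those contacts each frame could look something like the sketch below. The Board, Contact, Finger, and Glyph type and member names here are my assumptions standing in for whatever the SDK actually exposes.

```csharp
using UnityEngine;

// Minimal sketch: "Board", "Contact", and "Glyph" are hypothetical names,
// not the SDK's real types.
public class ContactLogger : MonoBehaviour
{
    void Update()
    {
        // Assumed: the SDK exposes the current frame's contacts somewhere like this.
        foreach (var contact in Board.Contacts)
        {
            // Every contact carries an ID, a position, and a phase.
            Debug.Log($"Contact {contact.Id} at {contact.Position}, phase {contact.Phase}");

            // Glyphs additionally carry orientation and a touched flag.
            if (contact is Glyph glyph)
            {
                Debug.Log($"Glyph orientation {glyph.Orientation}, touched: {glyph.IsTouched}");
            }
        }
    }
}
```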
For event systems (e.g., menus), a BoardUIInputModule is provided in place of Unity's default InputSystemUIInputModule.
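In practice you would likely just swap the component on the scene's EventSystem in the Inspector, but as a rough sketch, doing it at startup could look like this. I'm not showing the SDK's namespace for BoardUIInputModule since I don't have it in front of me; add the appropriate using directive.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.InputSystem.UI;
// plus the SDK's namespace that contains BoardUIInputModule

// Sketch: disable Unity's default UI input module and use BoardUIInputModule instead.
[RequireComponent(typeof(EventSystem))]
public class UseBoardInput : MonoBehaviour
{
    void Awake()
    {
        // Turn off the stock module so it doesn't compete for UI events.
        var defaultModule = GetComponent<InputSystemUIInputModule>();
        if (defaultModule != null)
            defaultModule.enabled = false;

        // Add the board's module if it isn't already on the EventSystem object.
        if (GetComponent<BoardUIInputModule>() == null)
            gameObject.AddComponent<BoardUIInputModule>();
    }
}
```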
I consider Microsoft's approach here a great one: a C#- and .NET-based SDK instead of yet another C and C++ library.