TL;DR: Yeah, it's somewhat possible.
The semi-official story goes like this: you write your desktop application with QtWidgets, and an embedded UI with QML. QtWidgets offers an extensive set of controls, while QML is good for rapid prototyping and for easily embedding fancy things like animations, thanks to its hardware-accelerated rendering backends such as OpenGL.
Porting a QtWidgets app to QML can be a huge task, and sometimes it might not be feasible at all, be it due to time constraints or because some widgets have no counterpart in the QML world.
Thus embedding QtWidgets in QML might be an option, i.e. writing some controls in QML and some in QtWidgets, and perhaps gradually rewriting widgets in QML (or leaving some widgets as they are).
Here is a screenshot of a prototype:
... and here is the corresponding code (LGPLv2):
The idea is not really new; however, there did not seem to be any code publicly available for it. The code uses a QQuickPaintedItem in QML to draw the widget with a QPainter. A caveat of using widgets in QML is that you often need to call update() by hand once the widget has changed.
Performance-wise, bigger widgets take several milliseconds to draw, and since the code always repaints the whole widget, it might not always be possible to reach a frame rate of 60 frames per second, depending on the use case. For a rather static UI without animations it might still be good enough, though.
An alternative to a QQuickPaintedItem would have been to render the widget to a QImage and use that in a custom QQuickItem which displays it via a QSGSimpleTextureNode or similar.
So far the code is just a prototype; feel free to let us know your suggestions, or just file a merge request!