I would like to define processes that take CityGML (an application schema of GML) datasets as (complex data) input, process the data, and send back an XML-based report about the input. However, I am quite lost with the data binding / parsing / generator part of the implementation. I was thinking about using citygml4j to parse the data, but I have no idea how to define a data binding for this. According to this:
You don't have to go all the way and create a binding/parser/generator
for citygml (although it would be great if you did and would contribute
it back :))
However, for your task you could use the
You will have to specify the parser/generator for the respective
in-/output in your process.
If you will be using the AbstractAnnotatedAlgorithm, your process could
look like this:
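A minimal sketch of what such a process could look like, assuming the annotation API and the GenericXMLDataBinding shipped with the 52°North WPS (class and identifier names here are illustrative, not prescribed):

```java
import org.apache.xmlbeans.XmlObject;
import org.n52.wps.algorithm.annotation.Algorithm;
import org.n52.wps.algorithm.annotation.ComplexDataInput;
import org.n52.wps.algorithm.annotation.ComplexDataOutput;
import org.n52.wps.algorithm.annotation.Execute;
import org.n52.wps.io.data.binding.complex.GenericXMLDataBinding;
import org.n52.wps.server.AbstractAnnotatedAlgorithm;

// Hypothetical process skeleton: takes a CityGML document as generic XML
// and returns an XML report.
@Algorithm(version = "1.0.0")
public class CityGMLReportAlgorithm extends AbstractAnnotatedAlgorithm {

    private XmlObject data;   // the CityGML input
    private XmlObject report; // the XML report output

    @ComplexDataInput(identifier = "data", binding = GenericXMLDataBinding.class)
    public void setData(XmlObject data) {
        this.data = data;
    }

    @ComplexDataOutput(identifier = "report", binding = GenericXMLDataBinding.class)
    public XmlObject getReport() {
        return report;
    }

    @Execute
    public void run() {
        // parse 'data' (e.g. with citygml4j) and build 'report' here
    }
}
```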
There are other tutorials in case you want to use another Algorithm
In the run method you could then use the citygml4j API to parse the data.
The in-/output will support the mime type text/xml, and this is what
you need to specify in the execute request later.
You could also specify a new mimetype/schema combination. Let us know,
if you need to know how to do this.
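For illustration, a WPS 1.0.0 Execute request referencing the mime type could look roughly like this (process and input/output identifiers are assumptions matching a hypothetical process, not fixed names):

```xml
<wps:Execute service="WPS" version="1.0.0"
    xmlns:wps="http://www.opengis.net/wps/1.0.0"
    xmlns:ows="http://www.opengis.net/ows/1.1">
  <ows:Identifier>org.example.CityGMLReportAlgorithm</ows:Identifier>
  <wps:DataInputs>
    <wps:Input>
      <ows:Identifier>data</ows:Identifier>
      <wps:Data>
        <wps:ComplexData mimeType="text/xml">
          <!-- CityGML payload goes here -->
        </wps:ComplexData>
      </wps:Data>
    </wps:Input>
  </wps:DataInputs>
  <wps:ResponseForm>
    <wps:RawDataOutput mimeType="text/xml">
      <ows:Identifier>report</ows:Identifier>
    </wps:RawDataOutput>
  </wps:ResponseForm>
</wps:Execute>
```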
Thank you very much for your quick reply. I was already thinking along the lines you set out but thank you very much for clearing it fully up!
I agree that it would be cool to have this binding/parser/generator for CityGML, but the problem is that some CityGML datasets can become very large (multiple GBs), so a *normal* parsing of such a file is problematic due to main-memory shortage. citygml4j solves this by being able to parse top-level features one by one. I therefore think it is incompatible with the binding/parser/generator machinery.
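The feature-by-feature mode mentioned above could be sketched roughly like this against the citygml4j 2.x API (the exact factory methods and property names may differ in other versions):

```java
import java.io.File;

import org.citygml4j.CityGMLContext;
import org.citygml4j.builder.jaxb.CityGMLBuilder;
import org.citygml4j.model.citygml.CityGML;
import org.citygml4j.xml.io.CityGMLInputFactory;
import org.citygml4j.xml.io.reader.CityGMLReader;
import org.citygml4j.xml.io.reader.FeatureReadMode;

public class ChunkedRead {
    public static void main(String[] args) throws Exception {
        CityGMLContext ctx = CityGMLContext.getInstance();
        CityGMLBuilder builder = ctx.createCityGMLBuilder();

        CityGMLInputFactory in = builder.createCityGMLInputFactory();
        // read one top-level feature at a time instead of the whole model
        in.setProperty(CityGMLInputFactory.FEATURE_READ_MODE,
                FeatureReadMode.SPLIT_PER_COLLECTION_MEMBER);

        CityGMLReader reader = in.createCityGMLReader(new File("input.gml"));
        try {
            while (reader.hasNext()) {
                CityGML chunk = reader.nextFeature();
                // process chunk, then let it be garbage collected
            }
        } finally {
            reader.close();
        }
    }
}
```

This keeps only one top-level feature in memory at a time, which is what makes multi-GB datasets tractable.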
Thanks again and if you have some ideas about how this could be implemented I would love to hear about it.
Another possible solution could be the GenericFileDataBinding, I guess. The data would be downloaded and saved to a file, which you could then parse with the citygml4j API. Besides that, I was working on streaming-based parsing some time ago. It never made it beyond a prototypical stage, but I will have another look at it and report back.
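In an annotated algorithm, that could look something like the fragment below. Note this is a sketch under assumptions: it assumes GenericFileDataBinding wraps a GenericFileData payload and that an accessor like getBaseFile exists to obtain a local file; check the binding class in your WPS version.

```java
import java.io.File;

import org.n52.wps.algorithm.annotation.ComplexDataInput;
import org.n52.wps.io.data.GenericFileData;
import org.n52.wps.io.data.binding.complex.GenericFileDataBinding;

// Inside the annotated algorithm: receive the input as a file written
// by the WPS framework, then hand it to citygml4j for chunked parsing.
@ComplexDataInput(identifier = "data", binding = GenericFileDataBinding.class)
public void setData(GenericFileData data) {
    File file = data.getBaseFile(false); // assumed accessor
    this.inputFile = file;
    // in the @Execute method: createCityGMLReader(inputFile) and
    // iterate over top-level features as in the chunked-read example
}
```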