Metadata and Infoset data walking refactored #55
Conversation
InfosetElement and related InfosetNode traits are now in runtime1.api. InfosetOutputter now has methods which use the InfosetElement and InfosetArray traits as the objects passed to the handler methods.

Added Metadata, ElementMetadata, etc. (also in the runtime1.api package), which provide limited exposure to the RuntimeData and CompilerInfo information. Added MetadataHandler, which is walked by MetadataWalker, which can be called from DataProcessor.

Also moved InfosetOutputter to the runtime1.api package as a way to avoid having full JAPI and SAPI proxies for now. For backward compatibility and javadoc/scaladoc reasons those may have to be put back later. But with all this JAPI/SAPI redundancy, the architecture is basically locked/blocked from refactoring.

The InfosetNode types are supposed to be the API, the DINodes the implementation. This solves the issue of which classes should show through to SAPI and JAPI for the infoset nodes: it should be the InfosetNode types, not the DINode types. Furthermore, the InfosetNode types can have methods to access the runtime metadata information needed by walkers, InfosetOutputters, etc. These hide our infoset implementation and runtime metadata (RuntimeData classes) implementations.

Moved the things JAPI/SAPI need into the runtime1.api package. So, for now anyway, those packages aren't self-contained anymore.

Still haven't figured out InfosetInputters yet.

Some trickery is needed to allow Java to work with InfosetOutputters that use an Impl class: BlobMethodImpl is now a class, so it must be extended first, while InfosetOutputter is a trait, so it is implemented (second) in Java declarations.

DAFFODIL-2832
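The class-versus-trait ordering described above can be sketched in Java. The member names below are hypothetical stand-ins, not the real Daffodil signatures; the point is only the declaration shape: Java permits extending exactly one class, so the Impl class must appear in the `extends` clause and the trait-derived interface in the `implements` clause.

```java
// Hypothetical stand-ins for the Scala types described above. A Scala class
// (like BlobMethodImpl) compiles to a plain Java class; a Scala trait with
// abstract members (like InfosetOutputter) is usable as a Java interface.
class BlobMethodImpl {
    protected String blobSuffix() { return ".blob"; } // hypothetical helper
}

interface InfosetOutputter {
    void startSimple(String name);
}

// Java allows only one superclass, so the Impl class is extended first and
// the trait is implemented second, the ordering the description mentions.
public class MyOutputter extends BlobMethodImpl implements InfosetOutputter {
    private final StringBuilder seen = new StringBuilder();

    @Override
    public void startSimple(String name) {
        seen.append(name).append(blobSuffix());
    }

    public String seen() { return seen.toString(); }

    public static void main(String[] args) {
        MyOutputter out = new MyOutputter();
        out.startSimple("data");
        System.out.println(out.seen()); // prints "data.blob"
    }
}
```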
Just for now to stay out of the way of the 3.6.0 work.
It was downcasting but didn't need to.
I couldn't get japi/sapi to publish their jars via publishM2 or publishLocal because scaladoc/javadoc were failing. I eventually learned that there is no simple option to turn this off in SBT; the powers that be seem to not want to make it easy. So I went through every problematic bit of scaladoc/javadoc and commented it out with line comments. Now the japi/sapi jars will publish again. ARRRRRRRRRRRRRRRRGGGGGGGGGGGGGGGG
So the Metadata API can't have them. NodeInfo.PrimType is no good for Java access, so I am using PrimTypeNode, which is a class.
```diff
 case arr: DIArray => {
   assertEquals(2, arr.length)
-  var a = arr(1).asInstanceOf[InfosetSimpleElement]
+  var a = arr.getOccurrence(1).asSimple
```
Revisit. Why doesn't apply(1) work here?
```diff
-enablePlugins(GenJavadocPlugin)
-enablePlugins(PublishJavadocPlugin)
+// enablePlugins(GenJavadocPlugin)
```
This was me trying desperately to turn off sbt doc processing on publishLocal. Failed.
It cannot be shut off.
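For reference, the settings one would normally reach for look like the following sbt fragment. These are standard sbt keys, but as the comment above reports, they did not prevent doc processing on publishLocal in this build; this is only a sketch of the usual attempt, not a working fix here.

```scala
// Typical sbt settings for skipping API doc generation and doc-jar publishing.
// Standard keys, but per the comment above they did not solve it in this build.
Compile / doc / sources := Seq.empty
Compile / packageDoc / publishArtifact := false
```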
```diff
  * XMLTextEscapeStyles for determining whether to wrap info in CDATA tags
  */
-public enum XMLTextEscapeStyle {
+public enum XMLTextEscapeStyle1 {
```
Temporarily eliminated much of the SAPI/JAPI abstractions that were making refactoring too hard.
```diff
- *
- * Do not use the Compiler constructor to create a Compiler. Instead, use [[Daffodil#compiler()]].
- */
+///**
```
Temporary - had to comment out all the javadoc here. It was preventing the japi jar from being created. I have no idea why it was failing when it worked before.
```diff
+ * Walks the handler over the runtime metadata structures
+ * @param handler - the handler is called-back during the walk as each metadata structure is encountered.
+ */
 def save(output: WritableByteChannel): Unit = dp.save(output)
```
WalkMetadata (below) is new. It is the only actual change in this file.
```scala
object Status extends Enumeration {
  type Status = Value
  val DONE, READY, VISITING = Value
}
```
I'm wondering if we should just define all these enums that have to show up on JAPI, in Java, and use them from Scala as Java Enums.
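The suggestion above can be sketched as follows: a hypothetical re-declaration of Status as a Java enum (not the actual Daffodil code). Java callers then see ordinary enum constants in JAPI, and Scala code can still use it directly.

```java
// Hypothetical Java version of the Scala `object Status extends Enumeration`
// shown above. Java callers get plain enum constants with switch support;
// Scala code can reference them directly (Status.DONE) and match on them,
// with no scala.Enumeration Value types leaking into the Java API.
public enum Status {
    DONE,
    READY,
    VISITING
}
```

From Scala this reads as `Status.DONE` and can be matched with `case Status.DONE => ...`, so defining the enum once in Java would serve both APIs.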
Material that was here moved up to runtime1.api.Infoset.scala or down to InfosetImpl.scala
```diff
-lazy val nameToChildNodeLookup = new HashMap[NamedQName, ArrayBuffer[DINode]]
+private lazy val nameToChildNodeLookup =
+  new java.util.HashMap[NamedQName, ArrayBuffer[DINode]]
```
Change to LinkedHashMap
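The suggested change matters because `java.util.HashMap` makes no guarantee about iteration order, while `LinkedHashMap` iterates in insertion order. A minimal sketch (the String keys are hypothetical stand-ins for NamedQName):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ChildLookupOrder {
    // Returns the keys in the order they were inserted. With a plain HashMap
    // the iteration order is unspecified and can differ across runs or JVM
    // versions; LinkedHashMap guarantees insertion order.
    public static List<String> insertionOrderKeys() {
        Map<String, Integer> lookup = new LinkedHashMap<>();
        lookup.put("title", 0);   // hypothetical child element names
        lookup.put("author", 1);
        lookup.put("year", 2);
        return new ArrayList<>(lookup.keySet());
    }

    public static void main(String[] args) {
        System.out.println(insertionOrderKeys()); // prints [title, author, year]
    }
}
```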
As with JAPI, this is mostly gone for now.
I actually want to think about generating these APIs, rather than maintaining them. The code base is much too hard to evolve with primary API functionality having to change in 3 places every time.
```diff
-sbt.version=1.9.6
+sbt.version=1.7.3
```
Closed. Superseded by apache#1112.
I did this in runtime1.api package.
Please review that package and the tests, which no longer need the DINode classes or ElementRuntimeData.
There are still quite a few classes/enums etc. shining through the API like PrimTypeNode and SchemaFileLocation.
I have clobbered JAPI/SAPI for the time being because they were making the necessary refactoring impossible, so ignore them.
I also wasted hours with javadoc and scaladoc failing when doing publishLocal/publishM2, resulting in no japi nor sapi jar. That's why I ended up commenting out most of the javadoc/scaladoc in JAPI/SAPI so that I could get on with things.
(You would think just telling sbt "don't create the doc" would be possible, but it seems not to be.)
I also backed out sbt to 1.7.3 because of weirdness.