
Ways to consolidate load-data.xml

We have a few different sets of data in our load-data.xml and would like to clean up the folders and the process. The original thought was to reuse the Ant targets we've already made for testing individual data sets and call them from load-data.xml.

I also found this: https://docs.onenetwork.com/devnet/latest/platform-user-s-guide/data-and-process-modeling-with-studio/load-data/module-specific-data

That sounds like what I'm trying to achieve, but I'm unsure how the suggestion at the bottom works:

  1. <LoadModuleData modulePrefix="ZBKS"/> - If you want to load only ZBKS module data.

If I call this with that module prefix, will it use all the inbound interfaces associated with that module, or is it referencing a LoadModuleData.xml file that has the prefix defined?

So I guess my question has two parts: what would be the preferred method? If the first, how can I call different Ant targets from within load-data.xml? If the second, is there a more specific example I can use to get started?

1 Answer

LoadModuleData is for when you have a set of data which is always needed when a given module is present.

For example, let's say a module has a set of options that makes more sense to store in a data model than in a string enumeration. In that case, you can create module data that is loaded whenever that module is used.

The way to create that is:

  1. Create a folder named "data" in the module.
  2. Put the associated data files in that "data" folder.
  3. Create a LoadModuleData.xml file in the module's "data" folder.
  4. Add a line to the dataset's LoadData.xml file calling LoadModuleData and referencing your module (sketched below).
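A rough sketch of that last step. Only the LoadModuleData line itself comes from the docs example; the surrounding root element and the ZBKS prefix are placeholders, so match them to whatever your dataset's load file and module actually use:

  <!-- dataset/data/LoadData.xml (illustrative wrapper element) -->
  <LoadData>
      <!-- existing dataset-level loads ... -->
      <!-- Pull in whatever the ZBKS module ships in its own data/LoadModuleData.xml -->
      <LoadModuleData modulePrefix="ZBKS"/>
  </LoadData>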

***

This may or may not answer your original question. You wrote that you are trying to clean up the folders in a dataset.

Something I've seen other project teams do is create sub-folders in the dataset/data folder for different sets of related data. For example: dataset/data/dev, dataset/data/qa, dataset/data/psr, etc.

Then they create different load-data targets for each set of data. That way they can run "ant load-data-dev", "ant load-data-qa", etc.
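A minimal build.xml sketch of that pattern. The "load-data" target name, the data.dir property, and the dataset paths are assumptions about how your existing load-data invocation is wired up, so substitute your real target and property names:

  <!-- build.xml (illustrative): thin wrapper targets that point one shared
       loading target at different sub-folders -->
  <target name="load-data-dev">
      <antcall target="load-data">
          <param name="data.dir" value="dataset/data/dev"/>
      </antcall>
  </target>

  <target name="load-data-qa">
      <antcall target="load-data">
          <param name="data.dir" value="dataset/data/qa"/>
      </antcall>
  </target>

  <!-- repeat for psr, etc. "load-data" would be whatever target you already
       use to load the dataset, changed to read its input folder from ${data.dir} -->

With something like that in place, "ant load-data-dev", "ant load-data-qa", and so on each run the same loading logic against a different sub-folder.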