I have recently brushed up on some basic Agile training. It mostly applies to teams, and the literature, at least, is focused on software development. I am not a software developer, and I work on my own for much of the time. That makes it sound like the training was a waste of time, but it wasn’t.
A common request is to maintain traceability into the supply chain. This has always proved to be problematic for a number of reasons, some technical and some political. I am offering here a potential solution.
We have, in many industries, now progressed beyond the clone-and-own approach to reuse, and some support for genuine reuse of requirements and other design data has become expected.
The underlying principle that I have used is that the organization owning the data is the only organization that will be afforded editing privileges. This applies whether the organizations in question are different legal entities, working to a contract, departments within a single business, or individuals with clear responsibility boundaries. This allows us to manage the access rights based on the owning organization.
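The ownership rule above can be sketched in a few lines of code. This is purely illustrative (it is not DNG's actual permission API, and the names `Artifact`, `can_edit`, and `can_read` are my own): edit rights follow the owning organization, while read rights are granted explicitly.

```python
# Illustrative sketch of ownership-based access control, assuming a
# simple model where each artifact records its owning organization.
from dataclasses import dataclass


@dataclass
class Artifact:
    name: str
    owning_org: str


def can_edit(user_org: str, artifact: Artifact) -> bool:
    # Only the owning organization is afforded editing privileges,
    # whether the "organizations" are legal entities, departments,
    # or individuals with clear responsibility boundaries.
    return user_org == artifact.owning_org


def can_read(user_org: str, artifact: Artifact, readers: set) -> bool:
    # Read access is granted per organization; the owner always reads.
    return user_org == artifact.owning_org or user_org in readers


req = Artifact("SYS-REQ-001", owning_org="PrimeCo")
print(can_edit("PrimeCo", req))       # the owner may edit
print(can_edit("SupplierLtd", req))   # a supplier may not
print(can_read("SupplierLtd", req, {"SupplierLtd"}))  # but may read if granted
```

In a real deployment this check is delegated to the tool's role and team-area configuration; the point of the sketch is only that access decisions key off the owning organization, not off individual users.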
DNG currently has the limitation that anyone who has read access to any part of a Project Area (PA) has read access to the entire Project Area. This can be managed by use of smaller PAs, tied together with a Global Configuration (GC).
Another issue has historically been that of access and firewalls. It has been rare for organizations to let suppliers inside their firewall. Cloud hosting eases this to some extent, and if your immediate reaction to that is one of horror that your data could be trusted to the cloud, then please look a little deeper. It can be as secure as working with a server in the basement of one of your many sites.
I recently had a need to connect DOORS Next Generation to DOORS 9 with OSLC. The environment that I had came complete with a configured setup of DOORS Web Access, which is necessary along with DOORS 9 and DNG. The setup for OSLC is actually fairly simple, but there are a couple of gotchas, so I will list the key elements here in the hope that it will help anybody going through the same process.
First add a Consumer (Outbound) to the JTS admin list. Make sure you add this to the JTS admin, not to the RM admin. Set a custom key if you like, and make sure you know the consumer secret. Set this as trusted, and Register.
I have been setting up a demo project in DOORS Next Generation. It starts with a Vision Document and then moves on to a number of Context diagrams. There will be more as the project develops, but that is as far as it goes for this article.
First I started with a clean DNG project and created all the artifact types that I thought I would need: the text types first, and then the module types. I have a general-use Heading type, a general Information type, and a general Diagram type. Everything else is more specifically targeted.
As a unit of measure, a headful is not consistent. Not consistent between individuals, not consistent in any one individual from day to day. I will explain what I mean by the term, how it varies, and how to stretch it.
Firstly, the definition. The Collins English Dictionary defines a headful as ‘the amount a head or brain will hold’. I use it specifically with reference to the understanding of a problem.
So how does it vary? When presented with a new project, I try to understand the big picture, the whole project. There is a level of abstraction at which that is possible. As time goes on, I become familiar with the problem, and the previous headful of information becomes compacted, making space for more detail. This is closely aligned to the process of turning data into knowledge and understanding. The initial headful for any given problem also depends on the other things already in my head; if I can closely align the new data with previously compacted data, then I can take in more at the first go. Each individual has a different capacity for a given problem space.
Note: my mental model of my mental model has diggers and dumper trucks and rollers and rotavators. While digging for the wanted information, there will often be surprising additional facts uncovered leading to interesting juxtapositions. I think a beautifully clean warehouse of a mind, while more efficient at retrieving things, might be rather dull and predictable.
If this is such a loosely defined quantity, then how is it helpful? First there is the appreciation that someone new to a project cannot understand all the detail across all the project areas on day one. We need to abstract the whole to a level that is comprehensible, and add detail once that framework is in place. Second, by understanding that this headful is not fixed through time, we can look at ways to supplement it, aid the compaction, and improve the ability to manipulate the data.
Tools. Pause for a moment and note what comes into your mind when I say tools. Tools come in many shapes and forms. The relevant ones here are:
- Thinking tools – techniques and shortcuts
- Data capture tools – databases, data storage
- Data manipulation tools – modelling tools, more sophisticated search tools on the captured data
Data storage can actually be a bad thing for building mental models. Once something is written down and documented, there is a tendency to forget it. Many people use lists in some form to clear the clutter from their heads and make space for creative thought. Data storage is, however, essential for the sharing of information across teams. We have to ensure we write and remember, not write and forget.
Thinking tools allow us to fit the data to a pattern and gain understanding. This might be by asking certain questions, or by making certain associations. They involve manipulating the data in the headful, shaking it up, and allowing it to settle back into less space.
Data manipulation tools are where the real increase in the unit of the headful can be seen. This allows us to use the thinking tools across a wider data set, to find the next piece of the puzzle that will make sense. These tools also allow us to share and merge our headful of data with the other team members’ headfuls.
Many industries rely heavily on a large number of standards, a selection of which are used on each project. There is a perennial problem of managing these so that they are visible within the engineering data environment along with other design artifacts. I wrote previously about some approaches to creating and maintaining traceability to the […]
How big is a project… or more specifically, how big should a DOORS Next Generation project be? Where do we draw the project boundaries in DNG? The answer, of course, is ‘it depends’; the trick is to know on what it depends. At one extreme, you run a single project and have everything in there. […]
It was a foggy day. Not so foggy that you needed to stay home, but one of those days where visibility was kind of OK. It was quite bright, but traveling at speed you really needed lights on. Ten years ago this would not have been a problem. Most people see the need for lights […]
I am going to take a look at two aspects of user access controls in DOORS Next Generation that frequently come up. The first is how to restrict read access for some team members to some parts of the project area. The second is how to limit who can make changes to individual attribute values. […]
I recently came across a project working Agile with Rational Team Concert (RTC). This is great, they have all the tracking tools they need to monitor project velocity, and to manage which stories will go into which sprint. Job done. Short blog entry here… Oh, just one thing. Who is ever in a project that […]