Chapter 8. Advanced NAV Development Tools

 

"Beauty is more important in computing than anywhere else in technology because software is so complicated. Beauty is the ultimate defense against complexity."

 
 --David Gelernter
 

"Often when you think you're at the end of something, you're at the beginning of something else."

 
 --Fred Rogers

Because NAV is extremely flexible and suitable for addressing many problem types, there are many choices for advanced NAV topics. We'll cover those that are most helpful when putting together a complete implementation.

First, we will review the overall structure of NAV as an application software system, aiming for a basic understanding of the process flow of the system along with some of the utility functions built into the standard product. Before designing modifications for NAV, it is important to have a good understanding of the structural "style" of the software, so that our enhancements are designed for a better fit.

Second, we will review some special components of the NAV system that allow us to accomplish more at less cost. These resources include features such as XMLPorts and Web Services that we can build on, and which help us use standard interface structures to connect with the world outside the NAV system. Fortunately, NAV has a good supply of such features.

NAV process flow

Primary data such as sales orders, purchase orders, production orders, and financial transactions flow through the NAV system as follows:

  • Initial Setup: Essential master data, reference data, and control and setup data are entered. Most of this preparation is done when the system (or a new application) is prepared for production use.
  • Transaction Entry: Transactions are entered into Documents and then transferred as part of a Posting sequence into a Journal table, or data may be entered directly into a Journal table. Data is preliminarily validated as it is entered, with master and auxiliary data tables being referenced as appropriate. Entry can be via manual keying, an automated transaction generation process, or an import function that brings in transaction data from another system.
  • Validate: This provides for additional data validation processing of a set of one or more transactions, often in batches, prior to submitting it to Posting.
  • Post: This posts a Journal Batch, which includes completing transaction data validation, adding entries to one or more Ledgers, and perhaps also updating a register and document history.
  • Utilize: Access the data via Pages, Queries, and/or Reports including those that feed Web Services and other consumers of data. At this point, total flexibility exists. Whatever tools are appropriate for users' needs should be used, whether internal to NAV or external (external tools are often referred to as Business Intelligence (BI) tools). NAV's built-in capabilities for data manipulation, extraction, and presentation should not be overlooked.
  • Maintenance: The continued maintenance of all NAV data as appropriate.
NAV process flow

The preceding image provides a simplified picture of the flow of application data through a NAV system. Many of the transaction types have additional reporting, multiple ledgers to update, and even auxiliary processing. But this represents the basic data flow in NAV whenever a Journal and a Ledger table are involved.

When we enhance an existing NAV functional area (such as Jobs or Service Management), we may need to enhance related process flow elements by adding new fields to journals, ledgers, posted documents, and so on. It's always a good idea to add new fields rather than change the use of standard fields. It makes debugging, maintenance, and upgrading all easier.

When we create a new functional area, we will likely want to replicate the standard NAV process flow in some form for the new application's data. For example, for our WDTU application, we handle the entry of Playlists in the same fashion as a Journal is entered. A day's Playlists would be similar to a Journal Batch in another application. When a day's shows have been broadcast, the Playlist will be posted into the Radio Show Ledger as a permanent record of the broadcasts.
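
To make the parallel concrete, the following is a minimal sketch (not the actual WDTU code from this book) of what the core of such a Playlist posting routine might look like in C/AL. The table and field names used here (Playlist Header, Playlist Line, Radio Show Entry, and their fields) are assumptions for illustration only:

  PostPlaylist(VAR PlaylistHeader : Record "Playlist Header")
  // Locals: PlaylistLine : Record "Playlist Line";
  //         RadioShowEntry : Record "Radio Show Entry";
  //         NextEntryNo : Integer;
  BEGIN
    // Final header-level validation before anything is written to the Ledger
    PlaylistHeader.TESTFIELD("Broadcast Date");

    IF RadioShowEntry.FINDLAST THEN
      NextEntryNo := RadioShowEntry."Entry No." + 1
    ELSE
      NextEntryNo := 1;

    PlaylistLine.SETRANGE("Playlist No.",PlaylistHeader."No.");
    IF PlaylistLine.FINDSET THEN
      REPEAT
        PlaylistLine.TESTFIELD("Radio Show No.");  // reject incomplete lines
        RadioShowEntry.INIT;
        RadioShowEntry."Entry No." := NextEntryNo;
        RadioShowEntry."Radio Show No." := PlaylistLine."Radio Show No.";
        RadioShowEntry."Broadcast Date" := PlaylistHeader."Broadcast Date";
        RadioShowEntry.INSERT;
        NextEntryNo := NextEntryNo + 1;
      UNTIL PlaylistLine.NEXT = 0;

    // As with standard NAV Journals, the posted lines are removed once the
    // Ledger entries have been created
    PlaylistLine.DELETEALL;
  END;

As in the standard application, a routine like this would normally live in a posting Codeunit so that the entire Batch is processed within a single database transaction.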

Initial setup and data preparation

Data must be maintained as new Master data becomes available, as various system operating parameters change, and so on. The standard approach for NAV data entry allows records to be entered that have just enough information to define the primary key fields, but not necessarily enough to support processing. This allows a great deal of flexibility in the timing and responsibility for entry and completeness of new data. This approach applies to both setup data entry and ongoing production transaction data entry.

This system design philosophy allows initial and incomplete data entry by one person, with validation and completion to be handled later by someone else. This works because often data comes into an organization on a piecemeal basis.

For example, a sales person might initialize a new customer entry with name, address, and phone number, entering just that data to which they have easy access. At this point, there is not enough information recorded to process orders for this new customer. At a later time, someone in the accounting department can set up posting groups, payment terms, and other control data that should not be controlled by the sales department. With this additional data, the new customer record is ready for production use.

The NAV data entry approach allows the system to be updated on an incremental basis as the data arrives, providing an operational flexibility that many systems lack. The other side of this flexibility is added responsibility for the users to ensure that partially updated information is completed in a timely fashion. For the customer who can't deal with that responsibility, it may be necessary to create special procedures or even system customizations that enforce the necessary discipline.
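
The following C/AL sketch illustrates the pattern (the field values are purely illustrative). The initial entry only needs the primary key plus whatever data the salesperson has at hand, while TESTFIELD calls of the kind found in posting routines catch anything that accounting has not yet completed:

  // Customer is a Record variable for the standard Customer table (18)
  Customer.INIT;
  Customer."No." := 'C00150';           // assumes manual numbering is allowed;
                                        // otherwise leave blank for the No. Series
  Customer.Name := 'New Prospect, Inc.';
  Customer."Phone No." := '312 555 0123';
  Customer.INSERT(TRUE);                // the record exists, but is not yet ready for posting

  // Later, posting an order for this customer runs checks similar to these;
  // each TESTFIELD stops the Posting with an error until the field is filled in
  Customer.TESTFIELD("Gen. Bus. Posting Group");
  Customer.TESTFIELD("Customer Posting Group");
  Customer.TESTFIELD("Payment Terms Code");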

Transaction entry

Transactions are entered into a Journal table. Data is preliminarily validated as it is entered; master and auxiliary data tables are referenced as appropriate. Validations are based on the evaluation of the individual transaction data plus the related Master records and associated reference tables (for example, lookups being satisfied, application or system setup parameter constraints being met, and so on).
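
Much of this field-by-field validation lives in the OnValidate triggers of the Journal table. The sketch below shows the general shape of such a trigger, using hypothetical WDTU-style field names (RadioShow is assumed to be a global Record variable for the Radio Show master table):

  "Radio Show No." - OnValidate()
  BEGIN
    IF "Radio Show No." = '' THEN
      EXIT;
    // GET raises a runtime error if no such master record exists
    RadioShow.GET("Radio Show No.");
    // Pull defaults from the master record into the journal line
    Description := RadioShow.Name;
    Duration := RadioShow."Run Time";
  END;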

Testing and posting the Journal batch

Any additional validations needed to ensure the integrity and completeness of the transaction data prior to Posting are done either in pre-Post routines or directly in the course of the Posting process. The actual Posting of a Journal Batch occurs when the transaction data has been completely validated. Depending on the specific application, when Journal transactions don't pass muster during this final validation stage, either the individual transaction is bypassed while the acceptable transactions are Posted, or the entire Journal Batch is rejected until the identified problem is resolved.

The Posting process adds entries to one or more Ledgers, and sometimes to a document history table. When a Journal Entry is Posted to a Ledger, it becomes a part of the permanent accounting record. Most data cannot be changed or deleted once it resides in a Ledger.
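
One common way a custom Ledger table can enforce this permanence (a design pattern, not a quote from the base product) is to block changes in its table triggers:

  // In the custom Ledger table's triggers; Text001 is a text constant such as
  // 'Posted ledger entries cannot be modified.'
  OnModify()
  BEGIN
    ERROR(Text001);
  END;

  OnDelete()
  BEGIN
    ERROR(Text001);  // deletions, if any, go through separate controlled routines
  END;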

Register tables may also be updated during Posting, recording the ranges of entry numbers posted to each Ledger, when they were posted, and in which batches. This adds to the transparency of the NAV application system for audits and analysis.
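
A register update during Posting typically looks like the following sketch, patterned on the standard G/L Register table (verify the field names against your version). FromEntryNo and ToEntryNo are assumed to have been captured as the first and last Ledger entries created by this Posting run:

  GLRegister.LOCKTABLE;
  IF GLRegister.FINDLAST THEN;                 // position on the last register record, if any
  GLRegister.INIT;                             // INIT leaves the primary key ("No.") untouched
  GLRegister."No." := GLRegister."No." + 1;
  GLRegister."From Entry No." := FromEntryNo;  // first Ledger Entry No. created in this run
  GLRegister."To Entry No." := ToEntryNo;      // last Ledger Entry No. created in this run
  GLRegister."Creation Date" := TODAY;
  GLRegister."User ID" := USERID;
  GLRegister.INSERT;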

In general, NAV follows the standard accounting practice of requiring Ledger revisions to be made by Posting reversing entries, rather than by deletion of problem entries. The overall result is that NAV is a very auditable system, a key requirement for a variety of government, legal, and certification requirements for information systems.

Utilizing and maintaining the data

The data in a NAV system can be accessed via Pages, Queries, and/or Reports, providing total flexibility. Whatever appropriate tools are available to the developer or the user should be used. There are some very good tools in NAV for data manipulation, extraction, and presentation. Among other things, these include the SIFT/FlowField functionality, the pervasive filtering capability (including the ability to apply filters to subordinate data structures), and the Navigate function. NAV includes the ability to create page parts for graphing, with a wide variety of predefined chart page parts included as part of the standard distribution. We can also create our own chart parts using tools delivered with the system or available from community blogs.
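
As a small illustration of the filtering and SIFT/FlowField capabilities (the field names come from the standard Customer table, but check them against your version; 'JR' is an illustrative salesperson code):

  // Ordinary filter plus a FlowFilter; the FlowFilter constrains the SIFT calculation
  Customer.SETRANGE("Salesperson Code",'JR');
  Customer.SETRANGE("Date Filter",0D,WORKDATE);
  IF Customer.FINDSET THEN
    REPEAT
      Customer.CALCFIELDS("Net Change (LCY)");   // FlowField summed via SIFT
      TotalNetChange := TotalNetChange + Customer."Net Change (LCY)";
    UNTIL Customer.NEXT = 0;
  MESSAGE('Net change for this salesperson: %1',TotalNetChange);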

The NAV database design approach could be referred to as a "rational normalization". NAV isn't constrained by a rigid normalized data structure, where every data element appears only once. The NAV data structure is normalized so long as that principle doesn't get in the way of processing speed. Where processing speed or ease of use for the user is improved by duplicating data across tables, NAV does so. In addition, the duplication of master file data into transactions allows for one-time modification of that data when appropriate (such as a ship-to address in a Sales Order). That data duplication also often greatly simplifies data access for analysis.
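
For example (a simplified sketch, not the actual base code of the Sales Header table), copying the customer's address into the order is what lets a user change the ship-to information for that one document without touching the Customer master record:

  Customer.GET(SalesHeader."Sell-to Customer No.");
  SalesHeader."Ship-to Name" := Customer.Name;
  SalesHeader."Ship-to Address" := Customer.Address;
  SalesHeader."Ship-to City" := Customer.City;
  SalesHeader."Ship-to Post Code" := Customer."Post Code";
  SalesHeader.MODIFY;
  // The Ship-to fields on this document can now be edited independently of
  // the Customer record, and later analysis can read them without extra joins.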

Data maintenance

As with any database-oriented application software, ongoing maintenance of Master data, reference data, and setup and control data is required. In NAV, maintenance of data uses many of the same data preparation tools as were initially used to set up the system.
