Hi David,
I can try to motivate the include concept first, and then open the can and look into it. You aren't the first to ask why we need it, but so far I felt it was premature to talk about features. I don't believe the debate about any form of preprocessing is quite over here, but I'll risk the flames :-)
It is fairly common with conditional compilation to define some predicate and guard compilation across your source code. It might be a simple one, like "this is a VHDL-2008 compile", e.g., "`if defined vhdl-2008", or it might be a more complex expression related to specific tools or analysis goals, e.g., "`if defined style_rules1 && defined (acme_tool || any_m_c_s_tool) && ! defined vhdl-87". A very common requirement is to define the macros in one place and then employ them across the code. That is rational source code organization, and it is the fundamental, if abstract, reason for the feature. That one place is not necessarily a file, but doing it in many places redundantly is... well... not rational.
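As a rough sketch of how such guards might read, using the backtick tool-directive style from my proposal (directive spellings like `if/`else/`end if are illustrative, not settled syntax, and the hyphens in the macro names above become underscores to keep them identifier-like):

    entity version_demo is
    end entity version_demo;

    architecture rtl of version_demo is
    `if defined vhdl_2008
      signal flags : boolean_vector(3 downto 0);      -- VHDL-2008-only type
    `else
      signal flags : bit_vector(3 downto 0);          -- fallback for older versions
    `end if
    begin
    `if defined style_rules1 && defined (acme_tool || any_m_c_s_tool) && ! defined vhdl_87
      assert flags'length = 4 report "style-specific check placeholder" severity note;
    `end if
    end architecture rtl;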
I am pretty clear in my mind that I would not want macro definitions to persist across file boundaries. The idea that you present a set of source files to a compiler and state defined in one file persists across the file boundary is messy. Yeesh! Some think it is hysterical that Verilog does that; it is really just historical and a burden they are stuck with. We will never do that.
There are 3 ways to get a consistent compilation of a body of source code. I can make the essential defines outside of the language and its preprocessing directives, such as on a tool command line. I can put them in an include file and reference it across my code base. I can instantiate them in each source file. The first 2 allow me to configure my compilation by edits in one place. The 3rd doesn't, and we shouldn't end up there. Simple stuff can be done on a command line and will likely be an option. The standard can suggest that in an informative note or ignore it. Any reasonable complexity will be captured in a file of preprocessing directives. A tool can be directed to include it at the front of any VHDL source it processes. Problem solved? In my opinion, that is not robust. The author of the source code has a composition and maintenance issue that include files solve.
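To make the include-file option concrete, here is a minimal sketch. The `define and `include directives and the file names are purely illustrative:

    -- project_defs.vhd_pp : the one place the configuration is edited
    `define vhdl_2008
    `define acme_tool
    `define style_rules1
    -- `define gate_level_sim                         -- not set in this configuration

    -- any source file in the code base
    `include "project_defs.vhd_pp"

    entity core is
    end entity core;

    architecture rtl of core is
    begin
    `if defined gate_level_sim
      -- gate-level-only modelling would go here
    `end if
    end architecture rtl;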
Another use case that some might identify with is setting up a specification of IP protection pragmas that will be applied to a body of code. The latter is a set of files, and the decision to compose those files that way is driven by style and maintenance goals. Include files for these pragmas are another good use case.
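For instance, a rough sketch only: the `protect keywords below follow the flavor of the VHDL-2008 IP protection directives, but treat the exact keyword names and values as illustrative rather than quoted from the LRM:

    -- protect_setup.vhd_pp : one place to maintain the protection envelope
    `protect key_keyowner = "Acme EDA", key_keyname = "acme-2011", key_method = "rsa"
    `protect data_method = "aes128-cbc"

    -- in each source file whose contents are to be protected
    `include "protect_setup.vhd_pp"
    `protect begin
    -- ... design units to be encrypted by the tool ...
    `protect end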
So the can of worms is something to look at. First, we are not putting this in the language, at least not the first class language. Second, we have the freedom to abstract the reference to a source file away from its expression in a file system.
We have a language that avoids source file references in favor of tools being responsible for consuming files of source code somehow and producing compiled design units in a library. IMHO that is not really changing.
The first class language does not reference files... sort of. When it must, we have created an abstraction and a tool mapping responsibility. The way we have addressed it so far is with a logical name and physical file mapping, where the mapping is defined outside the language. We do that for VHPI compiled code libraries, and we will face it with a VHDL DPI, too. We can do that again, or step closer and define a stronger model for preprocessing, still without polluting the first class language. For sure tools will readily do it, and it is likely to be a fairly interoperable solution, even if we shrink from defining it.
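For example, an include directive could name a logical unit rather than a path, with the tool mapping that logical name to a physical file outside the language. Both the directive spelling and the mapping format below are hypothetical:

    -- in the VHDL source: a logical name, no file system path
    `include project_defs

    -- tool-side mapping, analogous to logical library name -> path mapping;
    -- the format is tool-defined (a command-line option, a map file, etc.), e.g.:
    --   project_defs => /work/common/project_defs.vhd_pp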
The source file aspects are handled by tools in front of the compilation step, and artifacts persist after it. Compiled design units keep an association with files and line numbers for error handling, and it should not get worse with include files. If we are faithful about our processing requirements, all of this is handled prior to first class analysis (lexical analysis of the tokens in the VHDL grammar).
I agree with separation of concerns. Tool user interface, tool flows, mixed language concerns, and version compatibility issues have reasonable separation in this standard. I don’t want to screw it up with this. I also feel that we must enable effective production design flows and solve problems in a standard way for the benefit of this user community.
I think your question is essential, David. Is include an important requirement? I hope I have at least casually motivated it well.
Regards, John
From: owner-vhdl-200x@edaNOSPAM.org [mailto:owner-vhdl-200x@edaNOSPAM.org]
On Behalf Of David Smith
Sent: Thursday, August 25, 2011 8:16 AM
To: vhdl-200x@edaNOSPAM.org
Subject: RE: [vhdl-200x] conditional compilation response
VHDL blissfully ignores the concept of files in the language. Doing file inclusion opens a can of worms here. Is there really a requirement to now add the concept of file inclusion into a language that has no concept of it?
Regards
David
David W. Smith
Synopsys Scientist
Synopsys, Inc.
Synopsys Technology Park
2025 NW Cornelius Pass Road
Hillsboro, OR 97124
Voice: 503.547.6467
Main: 503.547.6000
Cell: 503.560.5389
FAX: 503.547.6906
Email:
http://www.synopsys.com
Saber Accelerates Robust Design
Predictable. Repeatable. Reliable. Proven.
From: owner-vhdl-200x@edaNOSPAM.org [mailto:owner-vhdl-200x@edaNOSPAM.org]
On Behalf Of Dio Gratia
Sent: Thursday, August 25, 2011 8:03 AM
To: vhdl-200x@edaNOSPAM.org
Subject: Re: [vhdl-200x] conditional compilation response
On 24/08/2011, at 4:17 PM, Shields, John wrote:
I will say that preprocessing itself is an implementation decision. It can be done in the context of the lexical analysis phase of the analyzer and often is. The real requirement is that it is not encumbered with first class language features and is compatible but orthogonal to the language syntax. Standardizing it for vhdl has real value for portable code composition. I only expressed it in my proposal as a preprocessor with a subset of C preprocessing features to make the high level ideas clear.
Orthogonal implies the name space for preprocessor directives is independent from the VHDL object name space. Compatible implies whichever one comes first (preprocessor or lexical analysis) has no trouble distinguishing the syntax. It's how you get conditional text to behave identically whether it is implemented internally or externally.
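A small illustration of that orthogonality, reusing the sketch syntax from earlier in the thread (still hypothetical): the preprocessor identifier and the VHDL object below share a name but live in separate name spaces, so neither pass is confused:

    `define debug                        -- preprocessor name space

    entity duv is
    end entity duv;

    architecture rtl of duv is
      signal debug : bit := '0';         -- VHDL name space; no clash with the macro
    begin
    `if defined debug
      debug <= '1';                      -- selected text is ordinary VHDL
    `end if
    end architecture rtl;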
Someone could look at the gnatprep preprocessor
http://www.adacore.com/wp-content/files/auto_update/gnat-unw-docs/html/gnat_ugn_18.html
and infer John's starting point:
Features (a usage sketch follows the list):
- define and undefine of an identifier (i.e., a macro definition)
- simple constant expressions involving such identifiers
- expression operators and, or, not, and defined
- if, elsif, else control structure
- a standard set of predefined identifiers that at least include language version
- source file inclusion
- conforms to the syntax of a standard tool directive defined in the VHDL-2008 LRM
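Pulling those features together in one sketch; the directive spellings, the predefined VHDL_VERSION identifier, and the use of and/or/not rather than John's C-style operators are all illustrative, not agreed syntax:

    -- define / undefine of an identifier
    `define rtl_sim
    `undefine rtl_sim

    -- if / elsif / else with and, or, not, defined, and a predefined
    -- language-version identifier (the name VHDL_VERSION is made up here)
    `if VHDL_VERSION = 2008
      -- analyzed only for a 2008 compile
    `elsif defined vhdl_93 or defined vhdl_87
      -- legacy branch
    `else
      -- default branch
    `end if

    -- source file inclusion, spelled as a VHDL-2008-style tool directive
    `include "project_defs.vhd_pp"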
Considering some of the requirements and non-requirements mentioned by Jim Lewis, and how the languages are related, it's not too surprising that the two can be similar.
There's a bit of an issue with design hierarchy, which currently goes design file -> design unit, while conditional text sort of implies design file -> design description -> design unit. There can be implementation methods that would require intermediate files if token backtracking were required and external preprocessing were used. Conditional text uncouples the design unit from the design file.
Standard compliance could imply an implementation decision above and beyond ease of implementation, forcing an internal implementation versus a preprocessor. It'd be nice not to put anyone on the hook. (And no, my own lexer doesn't use backtracking.)
Anyway, I stopped and learned how to do preprocessing at the boundary between the lexer and parser in my analyzer after John's cryptic remarks. It's not as bad as I thought, although I'm toying with the idea of a separate symbol table for it. There's precedent: I already have a separate store for the implementation-dependent mapping of design library simple names to library paths, because I also have a tiny parser that uses the lexer for reading an init file. It gives me the ability to redirect the file input while preserving and restoring the seek location, and it can use a separate input buffer. That would support source file inclusion.
I was originally convinced implementation would be a pain, but that's not really the case. I also use a derivative of the lexer in a DMSL tool and can see how easy it would be to write an external preprocessor as well.
--
JohnShields - 2011-09-21