Hi Doron,

I did not mean to suggest that initial values should be made non-deterministic, random, or tool dependent. I agree that allowing initial values to depend on a specific (and deterministic) tool implementation would be a worse solution. I apologize for any confusion I may have created.

Here's what I recommend: Since the real hardware has uncertainty regarding values at time 0, users should be discouraged from attempting to trigger active clock edges or sample values at time 0; that is a methodology issue, not a language issue. Nonetheless, it indicates that sampling values at time 0 is, at best, a corner case. And, if that's the case, we should not attempt to design new semantics to make assertions behave any more deterministically than the rest of the language. By that statement I mean explicitly that the language should not define the order in which events are processed at time 0, which is consistent with the existing scheduling semantics - the language does not define the order in which events are processed at any particular time.

Typically, nets attain their initial (or steady-state) value at time 0 only after processing the events due to initial statements and the propagation of initial values through combinational logic. Included among those events will be the events that trigger clock transitions. It appears that in order to guarantee sampling of those initial net values, the implementation would need to impose an ordering in which clock events at time 0 are processed last - after combinational logic has settled. It is this additional determinism that I do not believe should be added to the language: whether a clock transition at time 0 occurs before or after some other combinational event is currently not defined (and possibly implementation dependent), so users should not rely on any particular ordering.

The existing semantics in the LRM state that static variables initialized as part of their declaration are initialized before any initial or always blocks are started.
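As an illustration of the situation under discussion (this sketch is mine, not from the thread - the module and signal names are hypothetical), consider a clock toggled at time 0 from an initial block. The value a concurrent assertion samples at that first edge depends on the undefined ordering of time-0 events:

```systemverilog
// Hypothetical sketch of the time-0 ordering problem discussed above.
// Whether the first clock edge observes `d` as X or 1 depends on the
// (undefined) order in which the initial block and the continuous
// assignment are evaluated at time 0.
module time0_demo;
  logic clk;           // uninitialized: X until assigned
  logic a = 1'b1;      // static variable initialized in its declaration
  wire  d;

  assign d = a;        // combinational propagation scheduled at time 0

  initial clk = 1'b1;  // X -> 1 at time 0: an active posedge

  // Sampling `a` is safe (declaration-time initialization precedes the
  // start of any process), but sampling `d` is not: the clock event may
  // be processed before `assign d = a` has propagated.
  a1: assert property (@(posedge clk) d == 1'b1)
        else $warning("d sampled as %b at time %0t", d, $time);
endmodule
```

Whether `a1` passes or warns at time 0 is exactly the implementation-dependent behavior the paragraph above says users should not rely on.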
Even though the LRM does not explicitly say so, the same ought to be true for the default (uninitialized) value of all signals: X for four-state logic and 0 for two-state logic. Hence, when the Preponed region at time 0 is processed, only static variables initialized in their declaration will have been set to anything other than their uninitialized values.

The above rules do, in principle, seem to guarantee that a clock triggered at time 0 (presumably from an initial statement) may sample the initial value of any static variable initialized in its declaration. But they do not guarantee the sampled value of any other signal in the system. Users may attempt to work around this uncertainty by delaying the initial clock transition: either through an actual delay or by playing with the various scheduling regions (##0 or NBAs). Both of these stratagems are problematic for formal tools. Hence, users should avoid sampling values at time 0, and if they do, they should be aware that they may be relying on an unrealistic simulation model that may not reflect the actual system behavior.

Arturo

-----Original Message-----
From: Doron Bustan [mailto:dbustan@freescale.com]
Sent: Tuesday, January 30, 2007 6:04 AM
To: Arturo Salz
Cc: Korchemny, Dmitry; Rich, Dave; john.havlicek@freescale.com; Eduard.Cerny@synopsys.COM; sv-ac@eda-stds.org
Subject: Re: [sv-ac] reminder to vote on mantis 1550

Hi Arturo,

I have a question regarding the following paragraph:

> I don't believe it's possible for a simulator to accurately model the
> effect of powering up a system, which is what we usually refer to as
> time 0 setup or initialization. By its very nature, this is an analog
> process that cannot take place in zero time. I believe that attempting
> to force a deterministic simulation model, which is not possible in the
> real system, may conceal actual bugs and lull engineers into a false
> sense of security. In order to avoid such situations, a simulator should
> be conservative - without being overly pessimistic. In this case,
> conservative should cause designers to not rely on pre-initialized nets
> - consider that in reality even the power supplies may be changing at
> time 0.

When you say that the initial values should be non-deterministic, do you mean that they should be selected randomly (e.g., based on the seed)? Or do you mean that the standard should not specify the values, leaving them tool dependent? I think that a tool-dependent solution is worse than defining deterministic values, because unless you run the simulation on several different simulators, you still see only one non-realistic value, only now you do not have any control over it.

Doron

Received on Tue Jan 30 11:34:51 2007
This archive was generated by hypermail 2.1.8 : Tue Jan 30 2007 - 11:35:06 PST