Theoretical Aspects of Lexical Analysis/Exercise 10

From Wiki³

Revision as of 12:16, 29 March 2009 by Root (talk | contribs)

Compute the non-deterministic finite automaton (NFA) by using Thompson's algorithm. Compute the minimal deterministic finite automaton (DFA).
The alphabet is Σ = { a, b }. Indicate the number of processing steps for the given input string.

  • G = { a*|b, ba*, b* }, input string = aababb

NFA

The following is the result of applying Thompson's algorithm. State 8 recognizes the first expression (token T1); state 13 recognizes token T2; and state 17 recognizes token T3.

<graph> digraph nfa {

    { node [shape=circle style=invis] s }
 rankdir=LR; ratio=0.5
 node [shape=doublecircle,fixedsize=true,width=0.2,fontsize=10]; 8 13 17
 node [shape=circle,fixedsize=true,width=0.2,fontsize=10];
 s -> 0
 0 -> 1 
 1 -> 2
 1 -> 6
 2 -> 3
 2 -> 5
 3 -> 4 [label="a",fontsize=10]
 4 -> 3
 4 -> 5
 5 -> 8
 6 -> 7 [label="b",fontsize=10]
 7 -> 8
 0 -> 9
 9 -> 10 [label="b",fontsize=10]
 10 -> 11
 10 -> 13
 11 -> 12 [label="a",fontsize=10]
 12 -> 11
 12 -> 13
 0 -> 14
 14 -> 15
 14 -> 17
 15 -> 16 [label="b",fontsize=10]
 16 -> 15
 16 -> 17
 fontsize=10

} </graph>
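The ε-closure computation that drives the determination step can be sketched in Python. The edge list below is a transcription of the graph above (with the 15 → 16 edge taken on b, as the expression b* requires); `None` stands for an ε-transition:

```python
# NFA transcribed from the Thompson construction above.
# None marks an epsilon transition; final states: 8 (T1), 13 (T2), 17 (T3).
EDGES = [
    (0, None, 1), (1, None, 2), (1, None, 6), (2, None, 3), (2, None, 5),
    (3, 'a', 4), (4, None, 3), (4, None, 5), (5, None, 8),
    (6, 'b', 7), (7, None, 8),
    (0, None, 9), (9, 'b', 10), (10, None, 11), (10, None, 13),
    (11, 'a', 12), (12, None, 11), (12, None, 13),
    (0, None, 14), (14, None, 15), (14, None, 17),
    (15, 'b', 16), (16, None, 15), (16, None, 17),
]

def eps_closure(states):
    """All states reachable from `states` through epsilon transitions alone."""
    closure, stack = set(states), list(states)
    while stack:
        s = stack.pop()
        for src, sym, dst in EDGES:
            if src == s and sym is None and dst not in closure:
                closure.add(dst)
                stack.append(dst)
    return frozenset(closure)

# Start configuration of the subset construction: the closure of state 0.
print(sorted(eps_closure({0})))
```

The closure of the start state already contains the final states 8 and 17, so the empty prefix would be accepted as T1 (the first-listed token).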

DFA

Determination table for the above NFA:

Graphically, the DFA is represented as follows:

The minimization tree is as follows. Note that, before considering transition behavior, states are split according to the token they recognize. The tree expansion for non-splitting sets has been omitted for simplicity ("a" transitions for super-state {0, 1, 3}, and "a" and "b" transitions for super-state {1, 3}).
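Since the determination table itself is not reproduced here, it can be recomputed with a short subset-construction sketch. The edge list is my transcription of the NFA graph (taking the 15 → 16 edge on b, as b* requires), and the first-listed-token-wins rule is assumed for states containing several final NFA states:

```python
# Subset construction (determination) over the Thompson NFA above.
# None marks an epsilon transition; FINALS maps final states to tokens.
EDGES = [
    (0, None, 1), (1, None, 2), (1, None, 6), (2, None, 3), (2, None, 5),
    (3, 'a', 4), (4, None, 3), (4, None, 5), (5, None, 8),
    (6, 'b', 7), (7, None, 8),
    (0, None, 9), (9, 'b', 10), (10, None, 11), (10, None, 13),
    (11, 'a', 12), (12, None, 11), (12, None, 13),
    (0, None, 14), (14, None, 15), (14, None, 17),
    (15, 'b', 16), (16, None, 15), (16, None, 17),
]
FINALS = {8: 'T1', 13: 'T2', 17: 'T3'}

def eps_closure(states):
    closure, stack = set(states), list(states)
    while stack:
        s = stack.pop()
        for src, sym, dst in EDGES:
            if src == s and sym is None and dst not in closure:
                closure.add(dst)
                stack.append(dst)
    return frozenset(closure)

def move(states, sym):
    """NFA states reachable from `states` on one `sym` transition."""
    return {dst for src, s, dst in EDGES if src in states and s == sym}

start = eps_closure({0})
table, todo = {}, [start]          # DFA transition table, worklist
while todo:
    cur = todo.pop()
    if cur in table:
        continue
    table[cur] = {}
    for sym in 'ab':
        nxt = eps_closure(move(cur, sym))
        if nxt:                    # omit transitions to the dead state
            table[cur][sym] = nxt
            todo.append(nxt)

for state, row in table.items():
    # First-listed token wins when several final states are present.
    token = min((FINALS[s] for s in state if s in FINALS), default='-')
    print(sorted(state), token, {sym: sorted(nxt) for sym, nxt in row.items()})
```

Each printed row is one line of the determination table: the DFA state (as a set of NFA states), the token it recognizes, and its outgoing transitions.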

Given the minimization tree, the final minimal DFA is as follows. Note that states 2 and 4 cannot be the same since they recognize different tokens.
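The token-first splitting can be checked with a Moore-style partition-refinement sketch. The `TOKEN` and `DELTA` tables below are my own transcription of a determination result for this NFA (states numbered 0–4 in discovery order), not part of the original exercise:

```python
# Moore-style DFA minimization: start from the token-based partition,
# then split blocks whose states transition into different blocks.
TOKEN = {0: 'T1', 1: 'T1', 2: 'T1', 3: 'T2', 4: 'T3'}
DELTA = {(0, 'a'): 1, (0, 'b'): 2, (1, 'a'): 1,
         (2, 'a'): 3, (2, 'b'): 4, (3, 'a'): 3, (4, 'b'): 4}

def minimize(states, token, delta, alphabet='ab'):
    # Initial partition: states grouped by the token they recognize.
    by_token = {}
    for s in states:
        by_token.setdefault(token[s], []).append(s)
    blocks = list(by_token.values())
    while True:
        index = {s: i for i, block in enumerate(blocks) for s in block}
        # Signature: which block each symbol leads to (None = dead state).
        sig = lambda s: tuple(index.get(delta.get((s, a))) for a in alphabet)
        new_blocks = []
        for block in blocks:
            groups = {}
            for s in block:
                groups.setdefault(sig(s), []).append(s)
            new_blocks.extend(groups.values())
        if len(new_blocks) == len(blocks):   # no block was split: stable
            return new_blocks
        blocks = new_blocks

print(minimize(range(5), TOKEN, DELTA))
```

Under this transcription every block ends up a singleton, i.e. no two DFA states merge, which is consistent with the note that states recognizing different tokens can never be unified.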

Input Analysis