Compilation Encapsulation Or: Why Every Component Should Just Do Its Damn Job.


1 Compilation Encapsulation Or: Why Every Component Should Just Do Its Damn Job

2 “When a negative int literal (e.g. -5) appears in the code, should it be a single integer token whose value is -5, or two tokens: a minus and an integer whose value is 5?”

3 Well, in theory… We can write a lexer (maybe not with flex) with lookbehind that makes sure the previous token was neither a number nor a variable. (Or a function call, or a field reference. A pretty complicated lexer.)
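The lookbehind idea can be sketched as a small decision function over the previously emitted token. This is a minimal illustration, not part of the IC lexer; the PrevKind names are hypothetical:

```java
// Hypothetical sketch of the "complicated lexer" test: a '-' can begin a
// negative literal only when the previous token could NOT end an expression.
enum PrevKind { NUMBER, IDENT, RPAREN, RBRACKET, OTHER }

class MinusDisambiguator {
    static boolean minusStartsLiteral(PrevKind prev) {
        switch (prev) {
            case NUMBER:    // 3 - 5       -> binary minus
            case IDENT:     // x - 5       -> binary minus
            case RPAREN:    // f(x) - 5    -> binary minus
            case RBRACKET:  // a[i] - 5    -> binary minus
                return false;
            default:        // = -5, (-5, , -5 ...
                return true;
        }
    }
}
```

Even this sketch already has to enumerate every token kind that can end an expression, which is exactly the per-case complexity the slide warns about.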

4 But just because we can do it does that make it a good idea?

5 But what if we change the syntax? Professor Moriarty wants IC to be more like Matlab. He asks you to support scalar operations on arrays of scalars: array – scalar = [a1 – scalar, …, an – scalar]. And suddenly new int[n] - 6 is valid…

6 Generic compiler structure: Source text (txt) → Compiler Frontend (analysis) → Semantic Representation → Backend (synthesis) → Executable code (exe)

7 The IC compiler follows the same structure: IC Program (ic) → Lexical Analysis → Syntax Analysis (Parsing) → AST, Symbol Table, etc. → Intermediate Representation (IR) → Code Generation → x86 executable (exe)

8 Encapsulation, what does it mean? It means each component needs to do its job, without regard for what the other components are doing. The tokenizer only cares about dividing the stream into tokens:
– Invalid characters
– Keywords
– Strings and comments

9 The parser only cares about building a structure out of tokens:
– Assumes a valid stream of tokens
– Structural rules with no meaning
The semantic checker is free to worry only about semantics:
– Assumes a valid AST
– Actually worries about meaning

10 Fake Exam Question #1 Professor Xavier wants IC to be more like Python. He asks you to support array and string multiplication.
"abc"*3 → "abcabcabc"
(new MyClass[5] * 2).length → 10
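The intended run-time behavior can be sketched outside the compiler with two plain Java helpers. The names are hypothetical; the slide only specifies the two example results:

```java
// Sketch of the requested semantics: multiplying a string or an array
// by n yields n concatenated copies.
class RepeatSemantics {
    // "abc"*3 -> "abcabcabc"
    static String repeat(String s, int n) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) sb.append(s);
        return sb.toString();
    }

    // (new MyClass[5] * 2) -> an array of length 10
    static Object[] repeat(Object[] a, int n) {
        Object[] out = new Object[a.length * n];
        for (int i = 0; i < n; i++)
            System.arraycopy(a, 0, out, i * a.length, a.length);
        return out;
    }
}
```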

11 But suppose… Suppose you decided to define your strings like so:

\" { // move to a state that handles string content
    yybegin(STRING);
    in_string_literal = true;
}

<STRING> {
    \"/{VALID_STR_POSTFIX} { // found the end of a string, finish
        yybegin(YYINITIAL);
        in_string_literal = false;
        return new Token(sym.QUOTE, yyline + 1, string.toString());
    }
    \" { throw new LexicalError(yyline + 1); } // longest token only if invalid ahead
}

12 But suppose… Now we have to go back and fix the lexer, too. When, in reality, there was no real reason to perform that test:
– There’s no case of something after the string that the syntax won’t be able to cope with.

13 Back to the tokenizer not caring What needs changing?
Lexer: nothing
Syntax: nothing
Semantic checks: type check
Code generation: functionality of the operation

14 Fake Exam Question #2 We want IC to support binary numeric literals:
– With the following syntax: 0b010010101 (leading zeros after the binary signifier allowed)
– With the same range restrictions

15 Solution #1 We’ll add a new lexer token type, BINNUMBER:
– 0b[01]+
And a new syntax rule for a BinNumber literal:
– Which, really, is only BINNUMBER
And then check its range:
– Which is actually a lot easier than with decimals…

16 A short interlude: where does X go? Is property X lexical, syntactic or semantic? Two main deciding factors:
1. Correctness: Is there enough data to make the call right now?
2. Laziness: What will be gained by doing this right now? Is this the place where it’s easiest to do?

17 Example A: Range of decimal literals
Correctness:
– In any two’s complement implementation of integers, the bounds are not symmetric.
– So we can’t make the call until we know whether we have a positive or a negative number on our hands…
Laziness:
– Writing a lot of code that looks at the child expressions during syntax analysis is usually a bad sign.
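Concretely, for 32-bit two's complement ints the range is -2147483648 to 2147483647, so the digit string 2147483648 is legal only under a unary minus. A minimal sketch of the sign-dependent check (hypothetical helper, not the course's code):

```java
// The asymmetric bound of 32-bit two's complement integers: whether a
// digit string fits depends on the sign context it appears in.
class DecimalRange {
    static boolean fits(String digits, boolean negated) {
        long v = Long.parseLong(digits);      // digits only, no sign
        return negated
            ? v <= 2147483648L                // |Integer.MIN_VALUE| = 2^31
            : v <= 2147483647L;               // Integer.MAX_VALUE  = 2^31 - 1
    }
}
```

This is exactly why the decision cannot be made at tokenization time: the lexer sees only the digits, not whether a minus applies to them.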

18 Example B: range of binary literals
Correctness:
– All the data is there the second we get the token.
Laziness:
– Postponing the check means a continued separation between binary and decimal literals.
– If we check right now, we can convert the value to a number and forget all about it.

19 Back to Fake Question #2 So we can actually do it this way: We’ll add a new lexer rule:
– 0b[01]+
– We’ll also check the range here
– And then:
return new Token(sym.INTEGER, yyline + 1, bin2decimal(yytext()));
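The bin2decimal helper itself is not shown on the slide; one possible sketch, using BigInteger so that the allowed leading zeros and arbitrarily long literals are handled uniformly:

```java
import java.math.BigInteger;

// Hypothetical bin2decimal: convert a 0b-prefixed literal to an int,
// checking the range in the same place (binary literals are unsigned,
// so the only bound is Integer.MAX_VALUE).
class BinLiteral {
    static int bin2decimal(String text) {
        BigInteger v = new BigInteger(text.substring(2), 2); // strip "0b"
        if (v.bitLength() > 31)  // must fit a non-negative 32-bit int
            throw new NumberFormatException("binary literal out of range: " + text);
        return v.intValue();
    }
}
```

Note the contrast with Example A: because a binary literal carries no sign ambiguity, the whole check lives inside one lexer action.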

20 Where does Y go? Place the following property: a call to method foo() is a static call. Our guiding principle here is correctness:
Lexer? Syntax?

21 Where does Y go? Syntax breaks methods up into three types:
1. ClassName.foo() – definitely static
2. varname.foo() – definitely not
So… correct?
3. foo() – ???
So… not syntax.
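The reason the bare call defeats the grammar can be sketched as a classifier over the three call shapes. The helper and its flags are illustrative only; the point is the last case, which needs the symbol table:

```java
// Sketch: the first two call shapes are decidable from syntax alone,
// but a bare foo() is static only if foo is declared static in the
// enclosing class -- information that exists only in semantic analysis.
class CallKind {
    static String classify(boolean explicitClassReceiver,
                           boolean explicitObjectReceiver,
                           boolean declaredStatic) {
        if (explicitClassReceiver)  return "static";   // ClassName.foo()
        if (explicitObjectReceiver) return "virtual";  // varname.foo()
        // bare foo(): only the symbol table knows
        return declaredStatic ? "static" : "virtual";
    }
}
```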

22 Fake Question #3 We want to allow type inference in IC:
var a = new A();
A b = a;
C c = a; // type error

23 Q3: Lexer New token type VAR

24 Q3: Syntax We want an init expression whose type is VAR.
– Do we add VAR to types?
– No, we treat it like void.
How about AST representation?
– We modify our LocalVariable class to keep “TBD” in its type

25 Q3: Semantics To determine the new variable’s type:
– instead of computing its type field (which is TBD),
– compute the type of the expression.
Put that value into the symbol table, and all else is business as usual!
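The semantic step boils down to one substitution before the symbol-table insert. A minimal sketch, with string-valued types and a "TBD" sentinel standing in for the real LocalVariable machinery:

```java
// Sketch of the inference rule: if the declared type is the TBD
// sentinel, substitute the init expression's computed type; otherwise
// keep the declared type. Everything downstream is unchanged.
class VarInference {
    static final String TBD = "TBD";

    static String resolve(String declaredType, String initExprType) {
        return TBD.equals(declaredType) ? initExprType : declaredType;
    }
}
```

For var a = new A(); this resolves "TBD" against "A", and the symbol table records a : A exactly as it would for an explicit declaration.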

26 Good luck on the exam!

