OmniSciDB  c1a53651b2
generate_TableFunctionsFactory_init.Tokenize Class Reference

Public Member Functions

def __init__
 
def line
 
def tokens
 
def tokenize
 
def is_at_end
 
def current_token
 
def add_token
 
def lookahead
 
def advance
 
def peek
 
def can_token_be_double_char
 
def consume_double_char
 
def consume_single_char
 
def consume_whitespace
 
def consume_string
 
def consume_number
 
def consume_identifier
 
def is_token_identifier
 
def is_token_string
 
def is_digit
 
def is_alpha
 
def is_token_whitespace
 
def raise_tokenize_error
 

Public Attributes

 start
 
 curr
 

Private Attributes

 _line
 
 _tokens
 

Detailed Description

Definition at line 460 of file generate_TableFunctionsFactory_init.py.

Constructor & Destructor Documentation

Member Function Documentation

def generate_TableFunctionsFactory_init.Tokenize.add_token (self, type)

Definition at line 499 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), and generate_TableFunctionsFactory_init.Tokenize.start.

Referenced by generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_identifier(), generate_TableFunctionsFactory_init.Tokenize.consume_number(), generate_TableFunctionsFactory_init.Tokenize.consume_single_char(), and generate_TableFunctionsFactory_init.Tokenize.consume_string().

def add_token(self, type):
    lexeme = self.line[self.start:self.curr + 1]
    self._tokens.append(Token(type, lexeme))
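The lexeme is the inclusive slice of the line from start through curr. A minimal standalone sketch of the same slicing, using a stand-in Token namedtuple (the real Token class is defined elsewhere in generate_TableFunctionsFactory_init.py and also carries type constants such as Token.IDENTIFIER):

```python
from collections import namedtuple

# Stand-in for the generator's Token class; an assumption for this sketch only.
Token = namedtuple("Token", ["type", "lexeme"])

line = "x -> int"
start, curr = 5, 7             # "int" occupies columns 5..7
lexeme = line[start:curr + 1]  # inclusive slice, exactly as in add_token
tok = Token("IDENTIFIER", lexeme)
```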


def generate_TableFunctionsFactory_init.Tokenize.advance (   self)

Definition at line 508 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr.

Referenced by generate_TableFunctionsFactory_init.Parser.consume(), generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_identifier(), generate_TableFunctionsFactory_init.Tokenize.consume_number(), generate_TableFunctionsFactory_init.Tokenize.consume_single_char(), generate_TableFunctionsFactory_init.Tokenize.consume_string(), generate_TableFunctionsFactory_init.Tokenize.consume_whitespace(), and generate_TableFunctionsFactory_init.Parser.expect().

def advance(self):
    self.curr += 1


def generate_TableFunctionsFactory_init.Tokenize.can_token_be_double_char (   self)

Definition at line 514 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

def can_token_be_double_char(self):
    char = self.peek()
    return char in ("-",)


def generate_TableFunctionsFactory_init.Tokenize.consume_double_char (   self)

Definition at line 518 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), generate_TableFunctionsFactory_init.Tokenize.lookahead(), and generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

def consume_double_char(self):
    ahead = self.lookahead()
    if ahead == ">":
        self.advance()
        self.add_token(Token.RARROW)  # ->
        self.advance()
    else:
        self.raise_tokenize_error()


def generate_TableFunctionsFactory_init.Tokenize.consume_identifier (   self)
IDENTIFIER: [A-Za-z_][A-Za-z0-9_]*

Definition at line 585 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), and generate_TableFunctionsFactory_init.Tokenize.lookahead().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

def consume_identifier(self):
    """
    IDENTIFIER: [A-Za-z_][A-Za-z0-9_]*
    """
    while True:
        char = self.lookahead()
        if char and char.isalnum() or char == "_":
            self.advance()
        else:
            break
    self.add_token(Token.IDENTIFIER)
    self.advance()
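The loop extends curr while the next character continues the identifier, then emits the inclusive slice. The same scan can be sketched as a standalone function (scan_identifier and its return convention are illustrative, not part of the generator):

```python
def scan_identifier(line, curr):
    # curr points at the first identifier character; mirrors the
    # lookahead/advance loop in consume_identifier above.
    start = curr
    while curr + 1 < len(line) and (line[curr + 1].isalnum() or line[curr + 1] == "_"):
        curr += 1
    # Return the lexeme and the index just past it.
    return line[start:curr + 1], curr + 1

name, nxt = scan_identifier("row_mul(x)", 0)
```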


def generate_TableFunctionsFactory_init.Tokenize.consume_number (   self)
NUMBER: [0-9]+

Definition at line 572 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), and generate_TableFunctionsFactory_init.Tokenize.lookahead().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

def consume_number(self):
    """
    NUMBER: [0-9]+
    """
    while True:
        char = self.lookahead()
        if char and char.isdigit():
            self.advance()
        else:
            break
    self.add_token(Token.NUMBER)
    self.advance()


def generate_TableFunctionsFactory_init.Tokenize.consume_single_char (   self)

Definition at line 527 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), generate_TableFunctionsFactory_init.Tokenize.peek(), and generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

def consume_single_char(self):
    char = self.peek()
    if char == "(":
        self.add_token(Token.LPAR)
    elif char == ")":
        self.add_token(Token.RPAR)
    elif char == "<":
        self.add_token(Token.LESS)
    elif char == ">":
        self.add_token(Token.GREATER)
    elif char == ",":
        self.add_token(Token.COMMA)
    elif char == "=":
        self.add_token(Token.EQUAL)
    elif char == "|":
        self.add_token(Token.VBAR)
    elif char == "!":
        self.add_token(Token.BANG)
    elif char == "[":
        self.add_token(Token.LSQB)
    elif char == "]":
        self.add_token(Token.RSQB)
    elif char == ":":
        self.add_token(Token.COLON)
    else:
        self.raise_tokenize_error()
    self.advance()


def generate_TableFunctionsFactory_init.Tokenize.consume_string (   self)
STRING: \".*?\"

Definition at line 558 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), generate_TableFunctionsFactory_init.Tokenize.lookahead(), and generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

def consume_string(self):
    """
    STRING: \".*?\"
    """
    while True:
        char = self.lookahead()
        curr = self.peek()
        if char == '"' and curr != '\\':
            self.advance()
            break
        self.advance()
    self.add_token(Token.STRING)
    self.advance()
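The loop stops when the next character is a closing quote not preceded by a backslash, and the emitted lexeme keeps both quotes. A standalone sketch of that scan (scan_string is illustrative, not part of the generator; it assumes the string is terminated on the same line):

```python
def scan_string(line, curr):
    # curr points at the opening quote; stop when the next character is an
    # unescaped closing quote, then step onto it.
    start = curr
    while not (line[curr + 1] == '"' and line[curr] != "\\"):
        curr += 1
    curr += 1
    # Return the quoted lexeme and the index just past the closing quote.
    return line[start:curr + 1], curr + 1

lexeme, nxt = scan_string('"key=value" rest', 0)
```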


def generate_TableFunctionsFactory_init.Tokenize.consume_whitespace (   self)

Definition at line 555 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.advance().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

def consume_whitespace(self):
    self.advance()


def generate_TableFunctionsFactory_init.Tokenize.current_token (   self)

Definition at line 496 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), and generate_TableFunctionsFactory_init.Tokenize.start.

Referenced by generate_TableFunctionsFactory_init.Parser.consume(), generate_TableFunctionsFactory_init.Parser.expect(), generate_TableFunctionsFactory_init.Parser.match(), and generate_TableFunctionsFactory_init.Parser.raise_parser_error().

def current_token(self):
    return self.line[self.start:self.curr + 1]


def generate_TableFunctionsFactory_init.Tokenize.is_alpha (   self)

Definition at line 607 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

def is_alpha(self):
    return self.peek().isalpha()


def generate_TableFunctionsFactory_init.Tokenize.is_at_end (   self)

Definition at line 493 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr and generate_TableFunctionsFactory_init.Tokenize.line().

Referenced by generate_TableFunctionsFactory_init.Parser.parse_annotation(), generate_TableFunctionsFactory_init.Parser.parse_arg(), generate_TableFunctionsFactory_init.Parser.parse_args(), generate_TableFunctionsFactory_init.Parser.parse_templates(), generate_TableFunctionsFactory_init.Parser.parse_type(), generate_TableFunctionsFactory_init.Parser.parse_udtf(), and generate_TableFunctionsFactory_init.Tokenize.tokenize().

def is_at_end(self):
    return len(self.line) == self.curr


def generate_TableFunctionsFactory_init.Tokenize.is_digit (   self)

Definition at line 604 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

def is_digit(self):
    return self.peek().isdigit()


def generate_TableFunctionsFactory_init.Tokenize.is_token_identifier (   self)

Definition at line 598 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

def is_token_identifier(self):
    return self.peek().isalpha() or self.peek() == "_"


def generate_TableFunctionsFactory_init.Tokenize.is_token_string (   self)

Definition at line 601 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

def is_token_string(self):
    return self.peek() == '"'


def generate_TableFunctionsFactory_init.Tokenize.is_token_whitespace (   self)

Definition at line 610 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

def is_token_whitespace(self):
    return self.peek().isspace()


def generate_TableFunctionsFactory_init.Tokenize.line (   self)

Definition at line 469 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize._line.

Referenced by generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.current_token(), generate_TableFunctionsFactory_init.Tokenize.is_at_end(), generate_TableFunctionsFactory_init.Tokenize.lookahead(), generate_TableFunctionsFactory_init.Tokenize.peek(), and generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error().

def line(self):
    return self._line


def generate_TableFunctionsFactory_init.Tokenize.lookahead (   self)

Definition at line 503 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr and generate_TableFunctionsFactory_init.Tokenize.line().

Referenced by generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_identifier(), generate_TableFunctionsFactory_init.Tokenize.consume_number(), generate_TableFunctionsFactory_init.Tokenize.consume_string(), and generate_TableFunctionsFactory_init.Parser.parse_arg().

def lookahead(self):
    if self.curr + 1 >= len(self.line):
        return None
    return self.line[self.curr + 1]
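peek() returns the character under the cursor, while lookahead() returns the character one past it, or None at end of line; the scanners for identifiers, numbers, and strings rely on that None to terminate. A quick sketch of the distinction, with the cursor passed explicitly instead of stored on the object:

```python
line = "->"

def peek(curr):
    # Character under the cursor; assumes curr is in range, as in the class.
    return line[curr]

def lookahead(curr):
    # Character one past the cursor, or None at end of line.
    if curr + 1 >= len(line):
        return None
    return line[curr + 1]
```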


def generate_TableFunctionsFactory_init.Tokenize.peek (   self)

Definition at line 511 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr and generate_TableFunctionsFactory_init.Tokenize.line().

Referenced by generate_TableFunctionsFactory_init.Tokenize.can_token_be_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_single_char(), generate_TableFunctionsFactory_init.Tokenize.consume_string(), generate_TableFunctionsFactory_init.Tokenize.is_alpha(), generate_TableFunctionsFactory_init.Tokenize.is_digit(), generate_TableFunctionsFactory_init.Tokenize.is_token_identifier(), generate_TableFunctionsFactory_init.Tokenize.is_token_string(), generate_TableFunctionsFactory_init.Tokenize.is_token_whitespace(), and generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error().

def peek(self):
    return self.line[self.curr]


def generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error (   self)

Definition at line 613 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), and generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), and generate_TableFunctionsFactory_init.Tokenize.consume_single_char().

def raise_tokenize_error(self):
    curr = self.curr
    char = self.peek()
    raise TokenizeException(
        'Could not match char "%s" at pos %d on line\n %s' % (char, curr, self.line)
    )


def generate_TableFunctionsFactory_init.Tokenize.tokenize (   self)

Definition at line 476 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.can_token_be_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_identifier(), generate_TableFunctionsFactory_init.Tokenize.consume_number(), generate_TableFunctionsFactory_init.Tokenize.consume_single_char(), generate_TableFunctionsFactory_init.Tokenize.consume_string(), generate_TableFunctionsFactory_init.Tokenize.consume_whitespace(), generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.is_at_end(), generate_TableFunctionsFactory_init.Tokenize.is_digit(), generate_TableFunctionsFactory_init.Tokenize.is_token_identifier(), generate_TableFunctionsFactory_init.Tokenize.is_token_string(), generate_TableFunctionsFactory_init.Tokenize.is_token_whitespace(), and generate_TableFunctionsFactory_init.Tokenize.start.

def tokenize(self):
    while not self.is_at_end():
        self.start = self.curr

        if self.is_token_whitespace():
            self.consume_whitespace()
        elif self.is_digit():
            self.consume_number()
        elif self.is_token_string():
            self.consume_string()
        elif self.is_token_identifier():
            self.consume_identifier()
        elif self.can_token_be_double_char():
            self.consume_double_char()
        else:
            self.consume_single_char()
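Putting the pieces together, the dispatch order above (whitespace, number, string, identifier, double-char, single-char) can be exercised with a self-contained sketch; Token and the type names here are stand-ins for the generator's own definitions:

```python
from collections import namedtuple

# Stand-in Token; the real class lives in generate_TableFunctionsFactory_init.py.
Token = namedtuple("Token", ["type", "lexeme"])

SINGLES = {"(": "LPAR", ")": "RPAR", "<": "LESS", ">": "GREATER",
           ",": "COMMA", "=": "EQUAL", "|": "VBAR", "!": "BANG",
           "[": "LSQB", "]": "RSQB", ":": "COLON"}

def tokenize(line):
    tokens = []
    curr = 0
    while curr < len(line):
        start = curr
        ch = line[curr]
        if ch.isspace():
            curr += 1
        elif ch.isdigit():                       # NUMBER: [0-9]+
            while curr + 1 < len(line) and line[curr + 1].isdigit():
                curr += 1
            tokens.append(Token("NUMBER", line[start:curr + 1]))
            curr += 1
        elif ch == '"':                          # STRING; assumes it is terminated
            while not (line[curr + 1] == '"' and line[curr] != "\\"):
                curr += 1
            curr += 1
            tokens.append(Token("STRING", line[start:curr + 1]))
            curr += 1
        elif ch.isalpha() or ch == "_":          # IDENTIFIER
            while curr + 1 < len(line) and (line[curr + 1].isalnum()
                                            or line[curr + 1] == "_"):
                curr += 1
            tokens.append(Token("IDENTIFIER", line[start:curr + 1]))
            curr += 1
        elif ch == "-" and curr + 1 < len(line) and line[curr + 1] == ">":
            curr += 1                            # "->" is the only double-char token
            tokens.append(Token("RARROW", line[start:curr + 1]))
            curr += 1
        elif ch in SINGLES:
            tokens.append(Token(SINGLES[ch], ch))
            curr += 1
        else:
            raise ValueError('cannot match %r at %d in %r' % (ch, curr, line))
    return tokens

toks = tokenize("foo(x) -> int")
```

Note that can_token_be_double_char only treats "-" as a potential two-character token, which is why the sketch checks for "->" alone; a lone ">" still falls through to the single-char GREATER case.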


def generate_TableFunctionsFactory_init.Tokenize.tokens (   self)

Definition at line 473 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize._tokens.

Referenced by generate_TableFunctionsFactory_init.Parser.raise_parser_error().

def tokens(self):
    return self._tokens


Member Data Documentation

generate_TableFunctionsFactory_init.Tokenize._line
private

Definition at line 462 of file generate_TableFunctionsFactory_init.py.

Referenced by generate_TableFunctionsFactory_init.Tokenize.line().

generate_TableFunctionsFactory_init.Tokenize._tokens
private

Definition at line 463 of file generate_TableFunctionsFactory_init.py.

Referenced by generate_TableFunctionsFactory_init.Parser.consume(), generate_TableFunctionsFactory_init.Parser.current_token(), generate_TableFunctionsFactory_init.Parser.expect(), generate_TableFunctionsFactory_init.Parser.is_at_end(), generate_TableFunctionsFactory_init.Parser.lookahead(), generate_TableFunctionsFactory_init.Tokenize.tokens(), and generate_TableFunctionsFactory_init.Parser.tokens().

generate_TableFunctionsFactory_init.Tokenize.curr

Definition at line 465 of file generate_TableFunctionsFactory_init.py.

Referenced by generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), generate_TableFunctionsFactory_init.Tokenize.current_token(), generate_TableFunctionsFactory_init.Tokenize.is_at_end(), generate_TableFunctionsFactory_init.Tokenize.lookahead(), generate_TableFunctionsFactory_init.Tokenize.peek(), generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error(), and generate_TableFunctionsFactory_init.Tokenize.tokenize().

generate_TableFunctionsFactory_init.Tokenize.start

Definition at line 464 of file generate_TableFunctionsFactory_init.py.

Referenced by generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.current_token(), and generate_TableFunctionsFactory_init.Tokenize.tokenize().


The documentation for this class was generated from the following file: