OmniSciDB  085a039ca4
generate_TableFunctionsFactory_init.Tokenize Class Reference

Public Member Functions

def __init__
 
def line
 
def tokens
 
def tokenize
 
def is_at_end
 
def current_token
 
def add_token
 
def lookahead
 
def advance
 
def peek
 
def can_token_be_double_char
 
def consume_double_char
 
def consume_single_char
 
def consume_whitespace
 
def consume_string
 
def consume_number
 
def consume_identifier
 
def is_token_identifier
 
def is_token_string
 
def is_digit
 
def is_alpha
 
def is_token_whitespace
 
def raise_tokenize_error
 

Public Attributes

 start
 
 curr
 

Private Attributes

 _line
 
 _tokens
 

Detailed Description

Definition at line 448 of file generate_TableFunctionsFactory_init.py.

Constructor & Destructor Documentation

Member Function Documentation

def generate_TableFunctionsFactory_init.Tokenize.add_token (self, type)

Definition at line 487 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), and generate_TableFunctionsFactory_init.Tokenize.start.

Referenced by generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_identifier(), generate_TableFunctionsFactory_init.Tokenize.consume_number(), generate_TableFunctionsFactory_init.Tokenize.consume_single_char(), and generate_TableFunctionsFactory_init.Tokenize.consume_string().

488  def add_token(self, type):
489  lexeme = self.line[self.start:self.curr + 1]
490  self._tokens.append(Token(type, lexeme))
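The lexeme slice `self.line[self.start:self.curr + 1]` is inclusive of the character at `curr`: `start` marks the first character of the token and `curr` the last. A minimal sketch with hypothetical values (not taken from the source):

```python
# Hypothetical values illustrating the inclusive lexeme slice in add_token:
# start points at the first character of the token, curr at the last, so the
# slice end must be curr + 1.
line = "count -> 32"
start, curr = 0, 4              # the identifier "count" spans columns 0..4
lexeme = line[start:curr + 1]   # "count"
```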


def generate_TableFunctionsFactory_init.Tokenize.advance (   self)

Definition at line 496 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr.

Referenced by generate_TableFunctionsFactory_init.Parser.consume(), generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_identifier(), generate_TableFunctionsFactory_init.Tokenize.consume_number(), generate_TableFunctionsFactory_init.Tokenize.consume_single_char(), generate_TableFunctionsFactory_init.Tokenize.consume_string(), generate_TableFunctionsFactory_init.Tokenize.consume_whitespace(), and generate_TableFunctionsFactory_init.Parser.expect().

497  def advance(self):
498  self.curr += 1


def generate_TableFunctionsFactory_init.Tokenize.can_token_be_double_char (   self)

Definition at line 502 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

503  def can_token_be_double_char(self):
504  char = self.peek()
505  return char in ("-",)


def generate_TableFunctionsFactory_init.Tokenize.consume_double_char (   self)

Definition at line 506 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), generate_TableFunctionsFactory_init.Tokenize.lookahead(), and generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

507  def consume_double_char(self):
508  ahead = self.lookahead()
509  if ahead == ">":
510  self.advance()
511  self.add_token(Token.RARROW) # ->
512  self.advance()
513  else:
514  self.raise_tokenize_error()
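The only double-character token is `->`: the scanner peeks one character ahead and, if it sees `>`, emits a single RARROW token covering both characters; any other character after `-` is a tokenize error. A standalone sketch (the function name is hypothetical, and the string `"RARROW"` stands in for the `Token.RARROW` constant defined in generate_TableFunctionsFactory_init.py):

```python
# Sketch of the "->" handling in consume_double_char, written as a plain
# function over (line, curr) instead of instance state.
def double_char_token(line, curr):
    ahead = line[curr + 1] if curr + 1 < len(line) else None  # lookahead()
    if ahead == ">":
        return ("RARROW", line[curr:curr + 2])
    raise ValueError('could not match char "%s" at pos %d' % (line[curr], curr))

token = double_char_token("-> Column<int>", 0)  # ("RARROW", "->")
```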


def generate_TableFunctionsFactory_init.Tokenize.consume_identifier (   self)
IDENTIFIER: [A-Za-z_][A-Za-z0-9_]*

Definition at line 573 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), and generate_TableFunctionsFactory_init.Tokenize.lookahead().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

574  def consume_identifier(self):
575  """
576  IDENTIFIER: [A-Za-z_][A-Za-z0-9_]*
577  """
578  while True:
579  char = self.lookahead()
580  if char and char.isalnum() or char == "_":
581  self.advance()
582  else:
583  break
584  self.add_token(Token.IDENTIFIER)
585  self.advance()
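The docstring grammar `IDENTIFIER: [A-Za-z_][A-Za-z0-9_]*` can be cross-checked with a regex. Note that the loop above uses `str.isalnum()`, which also accepts non-ASCII letters and digits, so the scanner is slightly more permissive than the stated grammar; this sketch (with a hypothetical helper name) follows the regex form:

```python
import re

# The IDENTIFIER grammar from the docstring, as an anchored pattern.
IDENTIFIER = re.compile(r"[A-Za-z_][A-Za-z0-9_]*")

def scan_identifier(line, pos):
    """Return the identifier starting at pos, or None if none starts there
    (regex sketch of consume_identifier)."""
    m = IDENTIFIER.match(line, pos)
    return m.group(0) if m else None

name = scan_identifier("row_id | output_buffer", 0)  # "row_id"
```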


def generate_TableFunctionsFactory_init.Tokenize.consume_number (   self)
NUMBER: [0-9]+

Definition at line 560 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), and generate_TableFunctionsFactory_init.Tokenize.lookahead().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

561  def consume_number(self):
562  """
563  NUMBER: [0-9]+
564  """
565  while True:
566  char = self.lookahead()
567  if char and char.isdigit():
568  self.advance()
569  else:
570  break
571  self.add_token(Token.NUMBER)
572  self.advance()
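As the docstring says, `NUMBER: [0-9]+`: the loop consumes a maximal run of ASCII digits (there is no sign or decimal point handling; `-` is tokenized separately). A standalone sketch with a hypothetical helper name:

```python
# Scan a maximal digit run starting at pos, mirroring the lookahead loop
# in consume_number. Returns None if no digit starts at pos.
def scan_number(line, pos):
    end = pos
    while end < len(line) and line[end].isdigit():
        end += 1
    return line[pos:end] if end > pos else None

num = scan_number("42,", 0)  # "42" -- stops at the comma
```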


def generate_TableFunctionsFactory_init.Tokenize.consume_single_char (   self)

Definition at line 515 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), generate_TableFunctionsFactory_init.Tokenize.peek(), and generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

516  def consume_single_char(self):
517  char = self.peek()
518  if char == "(":
519  self.add_token(Token.LPAR)
520  elif char == ")":
521  self.add_token(Token.RPAR)
522  elif char == "<":
523  self.add_token(Token.LESS)
524  elif char == ">":
525  self.add_token(Token.GREATER)
526  elif char == ",":
527  self.add_token(Token.COMMA)
528  elif char == "=":
529  self.add_token(Token.EQUAL)
530  elif char == "|":
531  self.add_token(Token.VBAR)
532  elif char == "!":
533  self.add_token(Token.BANG)
534  elif char == "[":
535  self.add_token(Token.LSQB)
536  elif char == "]":
537  self.add_token(Token.RSQB)
538  elif char == ":":
539  self.add_token(Token.COLON)
540  else:
541  self.raise_tokenize_error()
542  self.advance()
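The `elif` chain maps each punctuation character to exactly one token type. An equivalent table-driven form is sketched below; the string values stand in for the `Token.*` constants, and the function name is hypothetical:

```python
# Table-driven equivalent of the elif chain in consume_single_char.
SINGLE_CHAR_TOKENS = {
    "(": "LPAR", ")": "RPAR", "<": "LESS", ">": "GREATER",
    ",": "COMMA", "=": "EQUAL", "|": "VBAR", "!": "BANG",
    "[": "LSQB", "]": "RSQB", ":": "COLON",
}

def single_char_token(char):
    try:
        return SINGLE_CHAR_TOKENS[char]
    except KeyError:
        # corresponds to the raise_tokenize_error() fallback
        raise ValueError('could not match char "%s"' % char)

kind = single_char_token("(")  # "LPAR"
```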


def generate_TableFunctionsFactory_init.Tokenize.consume_string (   self)
STRING: \".*?\"

Definition at line 546 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), generate_TableFunctionsFactory_init.Tokenize.lookahead(), and generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

547  def consume_string(self):
548  """
549  STRING: \".*?\"
550  """
551  while True:
552  char = self.lookahead()
553  curr = self.peek()
554  if char == '"' and curr != '\\':
555  self.advance()
556  break
557  self.advance()
558  self.add_token(Token.STRING)
559  self.advance()
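The loop scans forward to the next closing quote, matching the non-greedy pattern `\".*?\"` from the docstring, so scanning stops at the first unescaped `"` rather than the last one on the line. A regex sketch of that behavior:

```python
import re

# STRING: \".*?\" -- non-greedy, so the match ends at the first closing
# quote. (The real loop additionally refuses to close on an escaped \",
# which this simplified pattern does not model.)
STRING = re.compile(r'".*?"')

m = STRING.match('"require" = "x > 0"')
first = m.group(0) if m else None  # '"require"', not the whole line
```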


def generate_TableFunctionsFactory_init.Tokenize.consume_whitespace (   self)

Definition at line 543 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.advance().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

544  def consume_whitespace(self):
545  self.advance()


def generate_TableFunctionsFactory_init.Tokenize.current_token (   self)

Definition at line 484 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), and generate_TableFunctionsFactory_init.Tokenize.start.

Referenced by generate_TableFunctionsFactory_init.Parser.consume(), generate_TableFunctionsFactory_init.Parser.expect(), generate_TableFunctionsFactory_init.Parser.match(), and generate_TableFunctionsFactory_init.Parser.raise_parser_error().

485  def current_token(self):
486  return self.line[self.start:self.curr + 1]


def generate_TableFunctionsFactory_init.Tokenize.is_alpha (   self)

Definition at line 595 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

596  def is_alpha(self):
597  return self.peek().isalpha()


def generate_TableFunctionsFactory_init.Tokenize.is_at_end (   self)

Definition at line 481 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr and generate_TableFunctionsFactory_init.Tokenize.line().

Referenced by generate_TableFunctionsFactory_init.Parser.parse_annotation(), generate_TableFunctionsFactory_init.Parser.parse_arg(), generate_TableFunctionsFactory_init.Parser.parse_args(), generate_TableFunctionsFactory_init.Parser.parse_templates(), generate_TableFunctionsFactory_init.Parser.parse_type(), generate_TableFunctionsFactory_init.Parser.parse_udtf(), and generate_TableFunctionsFactory_init.Tokenize.tokenize().

482  def is_at_end(self):
483  return len(self.line) == self.curr


def generate_TableFunctionsFactory_init.Tokenize.is_digit (   self)

Definition at line 592 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

593  def is_digit(self):
594  return self.peek().isdigit()


def generate_TableFunctionsFactory_init.Tokenize.is_token_identifier (   self)

Definition at line 586 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

587  def is_token_identifier(self):
588  return self.peek().isalpha() or self.peek() == "_"


def generate_TableFunctionsFactory_init.Tokenize.is_token_string (   self)

Definition at line 589 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

590  def is_token_string(self):
591  return self.peek() == '"'


def generate_TableFunctionsFactory_init.Tokenize.is_token_whitespace (   self)

Definition at line 598 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

599  def is_token_whitespace(self):
600  return self.peek().isspace()


def generate_TableFunctionsFactory_init.Tokenize.line (   self)

Definition at line 457 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize._line.

Referenced by generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.current_token(), generate_TableFunctionsFactory_init.Tokenize.is_at_end(), generate_TableFunctionsFactory_init.Tokenize.lookahead(), generate_TableFunctionsFactory_init.Tokenize.peek(), and generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error().

458  def line(self):
459  return self._line


def generate_TableFunctionsFactory_init.Tokenize.lookahead (   self)

Definition at line 491 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr and generate_TableFunctionsFactory_init.Tokenize.line().

Referenced by generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_identifier(), generate_TableFunctionsFactory_init.Tokenize.consume_number(), generate_TableFunctionsFactory_init.Tokenize.consume_string(), and generate_TableFunctionsFactory_init.Parser.parse_arg().

492  def lookahead(self):
493  if self.curr + 1 >= len(self.line):
494  return None
495  return self.line[self.curr + 1]
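`lookahead()` is the scanner's one-character window past `curr`: it returns the next character, or `None` at end of line, which is why the consume loops test the result for truthiness before classifying it. Sketched as a plain function over `(line, curr)`:

```python
# lookahead() returns the character after curr, or None at end of line.
def lookahead(line, curr):
    if curr + 1 >= len(line):
        return None
    return line[curr + 1]

nxt = lookahead("ab", 0)  # "b"
end = lookahead("ab", 1)  # None -- no character after the last one
```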


def generate_TableFunctionsFactory_init.Tokenize.peek (   self)

Definition at line 499 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr and generate_TableFunctionsFactory_init.Tokenize.line().

Referenced by generate_TableFunctionsFactory_init.Tokenize.can_token_be_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_single_char(), generate_TableFunctionsFactory_init.Tokenize.consume_string(), generate_TableFunctionsFactory_init.Tokenize.is_alpha(), generate_TableFunctionsFactory_init.Tokenize.is_digit(), generate_TableFunctionsFactory_init.Tokenize.is_token_identifier(), generate_TableFunctionsFactory_init.Tokenize.is_token_string(), generate_TableFunctionsFactory_init.Tokenize.is_token_whitespace(), and generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error().

500  def peek(self):
501  return self.line[self.curr]


def generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error (   self)

Definition at line 601 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), and generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), and generate_TableFunctionsFactory_init.Tokenize.consume_single_char().

602  def raise_tokenize_error(self):
603  curr = self.curr
604  char = self.peek()
605  raise TokenizeException(
606  'Could not match char "%s" at pos %d on line\n %s' % (char, curr, self.line)
607  )
608 


def generate_TableFunctionsFactory_init.Tokenize.tokenize (   self)

Definition at line 464 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.can_token_be_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_identifier(), generate_TableFunctionsFactory_init.Tokenize.consume_number(), generate_TableFunctionsFactory_init.Tokenize.consume_single_char(), generate_TableFunctionsFactory_init.Tokenize.consume_string(), generate_TableFunctionsFactory_init.Tokenize.consume_whitespace(), generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.is_at_end(), generate_TableFunctionsFactory_init.Tokenize.is_digit(), generate_TableFunctionsFactory_init.Tokenize.is_token_identifier(), generate_TableFunctionsFactory_init.Tokenize.is_token_string(), generate_TableFunctionsFactory_init.Tokenize.is_token_whitespace(), and generate_TableFunctionsFactory_init.Tokenize.start.

465  def tokenize(self):
466  while not self.is_at_end():
467  self.start = self.curr
468 
469  if self.is_token_whitespace():
470  self.consume_whitespace()
471  elif self.is_digit():
472  self.consume_number()
473  elif self.is_token_string():
474  self.consume_string()
475  elif self.is_token_identifier():
476  self.consume_identifier()
477  elif self.can_token_be_double_char():
478  self.consume_double_char()
479  else:
480  self.consume_single_char()
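The dispatch order above matters: whitespace, digits, strings, and identifiers are tested first, then `-` as a possible double-char token, and everything else falls through to the single-char branch. An end-to-end sketch of the same loop, using plain `(type, lexeme)` tuples instead of the `Token` class (the function name and the token-type strings are illustrative stand-ins, and the regexes follow the grammars quoted in the docstrings above):

```python
import re

# Single-char punctuation map, standing in for the Token.* constants.
SINGLE = {"(": "LPAR", ")": "RPAR", "<": "LESS", ">": "GREATER", ",": "COMMA",
          "=": "EQUAL", "|": "VBAR", "!": "BANG", "[": "LSQB", "]": "RSQB",
          ":": "COLON"}

def sketch_tokenize(line):
    """End-to-end sketch of Tokenize.tokenize(), same dispatch order."""
    tokens, curr = [], 0
    while curr < len(line):
        char = line[curr]
        if char.isspace():
            curr += 1                                     # consume_whitespace
        elif m := re.match(r"[0-9]+", line[curr:]):       # consume_number
            tokens.append(("NUMBER", m.group(0))); curr += m.end()
        elif m := re.match(r'".*?"', line[curr:]):        # consume_string
            tokens.append(("STRING", m.group(0))); curr += m.end()
        elif m := re.match(r"[A-Za-z_][A-Za-z0-9_]*", line[curr:]):
            tokens.append(("IDENTIFIER", m.group(0))); curr += m.end()
        elif line[curr:curr + 2] == "->":                 # consume_double_char
            tokens.append(("RARROW", "->")); curr += 2
        elif char in SINGLE:                              # consume_single_char
            tokens.append((SINGLE[char], char)); curr += 1
        else:
            raise ValueError('could not match char "%s" at pos %d' % (char, curr))
    return tokens

toks = sketch_tokenize("udtf(x Column<int>) -> 1")
```

Note that a standalone `>` still tokenizes as GREATER because only `-` reaches the double-char branch.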


def generate_TableFunctionsFactory_init.Tokenize.tokens (   self)

Definition at line 461 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize._tokens.

Referenced by generate_TableFunctionsFactory_init.Parser.raise_parser_error().

462  def tokens(self):
463  return self._tokens


Member Data Documentation

generate_TableFunctionsFactory_init.Tokenize._line
private

Definition at line 450 of file generate_TableFunctionsFactory_init.py.

Referenced by generate_TableFunctionsFactory_init.Tokenize.line().

generate_TableFunctionsFactory_init.Tokenize._tokens
private

Definition at line 451 of file generate_TableFunctionsFactory_init.py.

Referenced by generate_TableFunctionsFactory_init.Parser.consume(), generate_TableFunctionsFactory_init.Parser.current_token(), generate_TableFunctionsFactory_init.Parser.expect(), generate_TableFunctionsFactory_init.Parser.is_at_end(), generate_TableFunctionsFactory_init.Parser.lookahead(), generate_TableFunctionsFactory_init.Tokenize.tokens(), and generate_TableFunctionsFactory_init.Parser.tokens().

generate_TableFunctionsFactory_init.Tokenize.curr

Definition at line 453 of file generate_TableFunctionsFactory_init.py.

Referenced by generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), generate_TableFunctionsFactory_init.Tokenize.current_token(), generate_TableFunctionsFactory_init.Tokenize.is_at_end(), generate_TableFunctionsFactory_init.Tokenize.lookahead(), generate_TableFunctionsFactory_init.Tokenize.peek(), generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error(), and generate_TableFunctionsFactory_init.Tokenize.tokenize().

generate_TableFunctionsFactory_init.Tokenize.start

Definition at line 452 of file generate_TableFunctionsFactory_init.py.

Referenced by generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.current_token(), and generate_TableFunctionsFactory_init.Tokenize.tokenize().


The documentation for this class was generated from the following file:

generate_TableFunctionsFactory_init.py