OmniSciDB  ca0c39ec8f
generate_TableFunctionsFactory_init.Tokenize Class Reference

Public Member Functions

def __init__
 
def line
 
def tokens
 
def tokenize
 
def is_at_end
 
def current_token
 
def add_token
 
def lookahead
 
def advance
 
def peek
 
def can_token_be_double_char
 
def consume_double_char
 
def consume_single_char
 
def consume_whitespace
 
def consume_string
 
def consume_number
 
def consume_identifier
 
def is_token_identifier
 
def is_token_string
 
def is_digit
 
def is_alpha
 
def is_token_whitespace
 
def raise_tokenize_error
 

Public Attributes

 start
 
 curr
 

Private Attributes

 _line
 
 _tokens
 

Detailed Description

Definition at line 456 of file generate_TableFunctionsFactory_init.py.

Constructor & Destructor Documentation

Member Function Documentation

def generate_TableFunctionsFactory_init.Tokenize.add_token(self, type)

Definition at line 495 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), generate_TableFunctionsFactory_init.UdtfNode.line, generate_TableFunctionsFactory_init.Parser.line, foreign_storage::Interval< T >.start, JoinColumnIterator.start, ai.heavy.jdbc.HeavyAIEscapeParser.Pair.start, JoinColumnTyped::Slice.start, JoinColumnTuple::Slice.start, generate_TableFunctionsFactory_init.Tokenize.start, and import_export::ImportStatus.start.

Referenced by generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_identifier(), generate_TableFunctionsFactory_init.Tokenize.consume_number(), generate_TableFunctionsFactory_init.Tokenize.consume_single_char(), and generate_TableFunctionsFactory_init.Tokenize.consume_string().

496  def add_token(self, type):
497  lexeme = self.line[self.start:self.curr + 1]
498  self._tokens.append(Token(type, lexeme))
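The lexeme is the inclusive slice `line[start:curr + 1]`, where `start` marks the first character of the token and `curr` the last. A minimal sketch of how the two indices delimit a lexeme, using an illustrative namedtuple in place of the real `Token` class defined elsewhere in the file:

```python
# Hypothetical stand-in for the Token class; the real definition lives
# elsewhere in generate_TableFunctionsFactory_init.py.
from collections import namedtuple

Token = namedtuple("Token", ["type", "lexeme"])

line = "Column<int32>"
start, curr = 0, 5                 # indices delimiting the lexeme "Column"
lexeme = line[start:curr + 1]      # inclusive of the character at curr
token = Token("IDENTIFIER", lexeme)
print(token.lexeme)                # Column
```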


def generate_TableFunctionsFactory_init.Tokenize.advance (   self)

Definition at line 504 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr.

Referenced by generate_TableFunctionsFactory_init.Parser.consume(), generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_identifier(), generate_TableFunctionsFactory_init.Tokenize.consume_number(), generate_TableFunctionsFactory_init.Tokenize.consume_single_char(), generate_TableFunctionsFactory_init.Tokenize.consume_string(), generate_TableFunctionsFactory_init.Tokenize.consume_whitespace(), and generate_TableFunctionsFactory_init.Parser.expect().

505  def advance(self):
506  self.curr += 1


def generate_TableFunctionsFactory_init.Tokenize.can_token_be_double_char (   self)

Definition at line 510 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

511  def can_token_be_double_char(self):
512  char = self.peek()
513  return char in ("-",)


def generate_TableFunctionsFactory_init.Tokenize.consume_double_char (   self)

Definition at line 514 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), anonymous_namespace{RelAlgDag.cpp}::RANodeIterator.advance(), generate_TableFunctionsFactory_init.Tokenize.lookahead(), and generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

515  def consume_double_char(self):
516  ahead = self.lookahead()
517  if ahead == ">":
518  self.advance()
519  self.add_token(Token.RARROW) # ->
520  self.advance()
521  else:
522  self.raise_tokenize_error()
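The only two-character token is `->` (RARROW): a `-` followed by `>` consumes both characters, while a lone `-` is a tokenize error. A standalone sketch of that logic, with illustrative names (`RARROW`, `TokenizeError`) in place of the real classes:

```python
# Illustrative stand-in for the real TokenizeException.
class TokenizeError(Exception):
    pass

def scan_arrow(line, curr):
    # Look one character past curr, as lookahead() does.
    ahead = line[curr + 1] if curr + 1 < len(line) else None
    if ahead == ">":
        return "RARROW", curr + 2   # consume both characters
    raise TokenizeError("expected '>' after '-'")

print(scan_arrow("->", 0))
```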


def generate_TableFunctionsFactory_init.Tokenize.consume_identifier (   self)
IDENTIFIER: [A-Za-z_][A-Za-z0-9_]*

Definition at line 581 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), anonymous_namespace{RelAlgDag.cpp}::RANodeIterator.advance(), and generate_TableFunctionsFactory_init.Tokenize.lookahead().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

582  def consume_identifier(self):
583  """
584  IDENTIFIER: [A-Za-z_][A-Za-z0-9_]*
585  """
586  while True:
587  char = self.lookahead()
588  if char and char.isalnum() or char == "_":
589  self.advance()
590  else:
591  break
592  self.add_token(Token.IDENTIFIER)
593  self.advance()
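The loop above is equivalent to matching the docstring's regex at the current position; a small standalone version for comparison:

```python
import re

def scan_identifier(line, start):
    # IDENTIFIER: [A-Za-z_][A-Za-z0-9_]*
    m = re.match(r"[A-Za-z_][A-Za-z0-9_]*", line[start:])
    return m.group(0) if m else None

print(scan_identifier("row_mult ->", 0))   # row_mult
```

Note that the class's loop uses `str.isalnum()`, which also accepts non-ASCII letters and digits, so it is slightly more permissive than the `[A-Za-z0-9_]` the docstring advertises.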


def generate_TableFunctionsFactory_init.Tokenize.consume_number (   self)
NUMBER: [0-9]+

Definition at line 568 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), anonymous_namespace{RelAlgDag.cpp}::RANodeIterator.advance(), and generate_TableFunctionsFactory_init.Tokenize.lookahead().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

569  def consume_number(self):
570  """
571  NUMBER: [0-9]+
572  """
573  while True:
574  char = self.lookahead()
575  if char and char.isdigit():
576  self.advance()
577  else:
578  break
579  self.add_token(Token.NUMBER)
580  self.advance()
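As with identifiers, the loop amounts to a regex match at the current position; NUMBER is simply one or more digits:

```python
import re

def scan_number(line, start):
    # NUMBER: [0-9]+
    m = re.match(r"[0-9]+", line[start:])
    return m.group(0) if m else None

print(scan_number("123>", 0))   # 123
```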


def generate_TableFunctionsFactory_init.Tokenize.consume_single_char (   self)

Definition at line 523 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), anonymous_namespace{RelAlgDag.cpp}::RANodeIterator.advance(), generate_TableFunctionsFactory_init.Tokenize.peek(), and generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

524  def consume_single_char(self):
525  char = self.peek()
526  if char == "(":
527  self.add_token(Token.LPAR)
528  elif char == ")":
529  self.add_token(Token.RPAR)
530  elif char == "<":
531  self.add_token(Token.LESS)
532  elif char == ">":
533  self.add_token(Token.GREATER)
534  elif char == ",":
535  self.add_token(Token.COMMA)
536  elif char == "=":
537  self.add_token(Token.EQUAL)
538  elif char == "|":
539  self.add_token(Token.VBAR)
540  elif char == "!":
541  self.add_token(Token.BANG)
542  elif char == "[":
543  self.add_token(Token.LSQB)
544  elif char == "]":
545  self.add_token(Token.RSQB)
546  elif char == ":":
547  self.add_token(Token.COLON)
548  else:
549  self.raise_tokenize_error()
550  self.advance()
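The if/elif chain maps exactly one character to one token type; the same table can be expressed as a dict lookup (token names here are illustrative strings, not the real `Token` constants):

```python
SINGLE_CHAR_TOKENS = {
    "(": "LPAR", ")": "RPAR", "<": "LESS", ">": "GREATER",
    ",": "COMMA", "=": "EQUAL", "|": "VBAR", "!": "BANG",
    "[": "LSQB", "]": "RSQB", ":": "COLON",
}

def classify(char):
    try:
        return SINGLE_CHAR_TOKENS[char]
    except KeyError:
        # The real method calls raise_tokenize_error() here.
        raise ValueError("could not match char %r" % char)

print(classify("("))   # LPAR
```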


def generate_TableFunctionsFactory_init.Tokenize.consume_string (   self)
STRING: \".*?\"

Definition at line 554 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), anonymous_namespace{RelAlgDag.cpp}::RANodeIterator.advance(), generate_TableFunctionsFactory_init.Tokenize.lookahead(), and generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

555  def consume_string(self):
556  """
557  STRING: \".*?\"
558  """
559  while True:
560  char = self.lookahead()
561  curr = self.peek()
562  if char == '"' and curr != '\\':
563  self.advance()
564  break
565  self.advance()
566  self.add_token(Token.STRING)
567  self.advance()
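The scan advances until it sees a closing quote that is not escaped: the lookahead character is `"` and the current character (peek) is not a backslash. A standalone sketch of the same escape check, operating on explicit indices instead of instance state:

```python
def scan_string(line, start):
    # start indexes the opening quote
    curr = start
    while curr + 1 < len(line):
        # Closing quote found only if the char before it is not a backslash.
        if line[curr + 1] == '"' and line[curr] != "\\":
            return line[start:curr + 2]   # include both quotes
        curr += 1
    raise ValueError("unterminated string")

print(scan_string('"abc" rest', 0))   # "abc"
```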


def generate_TableFunctionsFactory_init.Tokenize.consume_whitespace (   self)

Definition at line 551 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.advance(), and anonymous_namespace{RelAlgDag.cpp}::RANodeIterator.advance().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

552  def consume_whitespace(self):
553  self.advance()


def generate_TableFunctionsFactory_init.Tokenize.current_token (   self)

Definition at line 492 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), generate_TableFunctionsFactory_init.UdtfNode.line, generate_TableFunctionsFactory_init.Parser.line, foreign_storage::Interval< T >.start, JoinColumnIterator.start, ai.heavy.jdbc.HeavyAIEscapeParser.Pair.start, JoinColumnTyped::Slice.start, JoinColumnTuple::Slice.start, generate_TableFunctionsFactory_init.Tokenize.start, and import_export::ImportStatus.start.

Referenced by generate_TableFunctionsFactory_init.Parser.consume(), generate_TableFunctionsFactory_init.Parser.expect(), generate_TableFunctionsFactory_init.Parser.match(), and generate_TableFunctionsFactory_init.Parser.raise_parser_error().

493  def current_token(self):
494  return self.line[self.start:self.curr + 1]


def generate_TableFunctionsFactory_init.Tokenize.is_alpha (   self)

Definition at line 603 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

604  def is_alpha(self):
605  return self.peek().isalpha()


def generate_TableFunctionsFactory_init.Tokenize.is_at_end (   self)

Definition at line 489 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), generate_TableFunctionsFactory_init.UdtfNode.line, and generate_TableFunctionsFactory_init.Parser.line.

Referenced by generate_TableFunctionsFactory_init.Parser.parse_annotation(), generate_TableFunctionsFactory_init.Parser.parse_arg(), generate_TableFunctionsFactory_init.Parser.parse_args(), generate_TableFunctionsFactory_init.Parser.parse_templates(), generate_TableFunctionsFactory_init.Parser.parse_type(), generate_TableFunctionsFactory_init.Parser.parse_udtf(), and generate_TableFunctionsFactory_init.Tokenize.tokenize().

490  def is_at_end(self):
491  return len(self.line) == self.curr


def generate_TableFunctionsFactory_init.Tokenize.is_digit (   self)

Definition at line 600 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

601  def is_digit(self):
602  return self.peek().isdigit()


def generate_TableFunctionsFactory_init.Tokenize.is_token_identifier (   self)

Definition at line 594 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

595  def is_token_identifier(self):
596  return self.peek().isalpha() or self.peek() == "_"


def generate_TableFunctionsFactory_init.Tokenize.is_token_string (   self)

Definition at line 597 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

598  def is_token_string(self):
599  return self.peek() == '"'


def generate_TableFunctionsFactory_init.Tokenize.is_token_whitespace (   self)

Definition at line 606 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

607  def is_token_whitespace(self):
608  return self.peek().isspace()


def generate_TableFunctionsFactory_init.Tokenize.line (   self)

Definition at line 465 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize._line.

Referenced by generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.current_token(), generate_TableFunctionsFactory_init.Tokenize.is_at_end(), generate_TableFunctionsFactory_init.Tokenize.lookahead(), generate_TableFunctionsFactory_init.Tokenize.peek(), and generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error().

466  def line(self):
467  return self._line


def generate_TableFunctionsFactory_init.Tokenize.lookahead (   self)

Definition at line 499 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), generate_TableFunctionsFactory_init.UdtfNode.line, and generate_TableFunctionsFactory_init.Parser.line.

Referenced by generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_identifier(), generate_TableFunctionsFactory_init.Tokenize.consume_number(), generate_TableFunctionsFactory_init.Tokenize.consume_string(), and generate_TableFunctionsFactory_init.Parser.parse_arg().

500  def lookahead(self):
501  if self.curr + 1 >= len(self.line):
502  return None
503  return self.line[self.curr + 1]


def generate_TableFunctionsFactory_init.Tokenize.peek (   self)

Definition at line 507 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), generate_TableFunctionsFactory_init.UdtfNode.line, and generate_TableFunctionsFactory_init.Parser.line.

Referenced by generate_TableFunctionsFactory_init.Tokenize.can_token_be_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_single_char(), generate_TableFunctionsFactory_init.Tokenize.consume_string(), generate_TableFunctionsFactory_init.Tokenize.is_alpha(), generate_TableFunctionsFactory_init.Tokenize.is_digit(), generate_TableFunctionsFactory_init.Tokenize.is_token_identifier(), generate_TableFunctionsFactory_init.Tokenize.is_token_string(), generate_TableFunctionsFactory_init.Tokenize.is_token_whitespace(), and generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error().

508  def peek(self):
509  return self.line[self.curr]
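`peek()` returns the character at `curr`, while `lookahead()` returns the one after it, or `None` at end of line. A standalone sketch of the distinction, with plain locals in place of the instance state:

```python
line = "->"
curr = 0

def peek():
    # Character under the cursor.
    return line[curr]

def lookahead():
    # Character after the cursor, or None past the end of the line.
    if curr + 1 >= len(line):
        return None
    return line[curr + 1]

print(peek())        # -
print(lookahead())   # >
```

Unlike `lookahead()`, `peek()` does no bounds check, so callers are expected to have verified `is_at_end()` first.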


def generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error (   self)

Definition at line 609 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), generate_TableFunctionsFactory_init.UdtfNode.line, generate_TableFunctionsFactory_init.Parser.line, and generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), and generate_TableFunctionsFactory_init.Tokenize.consume_single_char().

610  def raise_tokenize_error(self):
611  curr = self.curr
612  char = self.peek()
613  raise TokenizeException(
614  'Could not match char "%s" at pos %d on line\n %s' % (char, curr, self.line)
615  )


def generate_TableFunctionsFactory_init.Tokenize.tokenize (   self)

Definition at line 472 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.can_token_be_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_identifier(), generate_TableFunctionsFactory_init.Tokenize.consume_number(), generate_TableFunctionsFactory_init.Tokenize.consume_single_char(), generate_TableFunctionsFactory_init.Tokenize.consume_string(), generate_TableFunctionsFactory_init.Tokenize.consume_whitespace(), generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.is_at_end(), generate_TableFunctionsFactory_init.Tokenize.is_digit(), generate_TableFunctionsFactory_init.Tokenize.is_token_identifier(), generate_TableFunctionsFactory_init.Tokenize.is_token_string(), generate_TableFunctionsFactory_init.Tokenize.is_token_whitespace(), foreign_storage::Interval< T >.start, JoinColumnIterator.start, ai.heavy.jdbc.HeavyAIEscapeParser.Pair.start, JoinColumnTyped::Slice.start, JoinColumnTuple::Slice.start, generate_TableFunctionsFactory_init.Tokenize.start, and import_export::ImportStatus.start.

473  def tokenize(self):
474  while not self.is_at_end():
475  self.start = self.curr
476 
477  if self.is_token_whitespace():
478  self.consume_whitespace()
479  elif self.is_digit():
480  self.consume_number()
481  elif self.is_token_string():
482  self.consume_string()
483  elif self.is_token_identifier():
484  self.consume_identifier()
485  elif self.can_token_be_double_char():
486  self.consume_double_char()
487  else:
488  self.consume_single_char()
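The dispatch loop above can be condensed into a single end-to-end function. Token kinds are illustrative tuples rather than the real `Token` objects, and the sample signature line is made up, but the branch order (whitespace, number, string, identifier, double-char, single-char) mirrors the method:

```python
import re

def tokenize(line):
    tokens, curr = [], 0
    single = {"(": "LPAR", ")": "RPAR", "<": "LESS", ">": "GREATER",
              ",": "COMMA", "=": "EQUAL", "|": "VBAR", "!": "BANG",
              "[": "LSQB", "]": "RSQB", ":": "COLON"}
    while curr < len(line):
        char = line[curr]
        if char.isspace():
            curr += 1
        elif char.isdigit():
            m = re.match(r"[0-9]+", line[curr:])
            tokens.append(("NUMBER", m.group(0)))
            curr += m.end()
        elif char == '"':
            m = re.match(r'"(?:\\.|[^"\\])*"', line[curr:])
            tokens.append(("STRING", m.group(0)))
            curr += m.end()
        elif char.isalpha() or char == "_":
            m = re.match(r"[A-Za-z_][A-Za-z0-9_]*", line[curr:])
            tokens.append(("IDENTIFIER", m.group(0)))
            curr += m.end()
        elif char == "-":
            # The only double-char token is the arrow.
            assert line[curr:curr + 2] == "->", "expected '->'"
            tokens.append(("RARROW", "->"))
            curr += 2
        else:
            tokens.append((single[char], char))
            curr += 1
    return tokens

print(tokenize("udtf(Column<int32> x) -> int32"))
```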


def generate_TableFunctionsFactory_init.Tokenize.tokens (   self)

Definition at line 469 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize._tokens.

Referenced by generate_TableFunctionsFactory_init.Parser.raise_parser_error().

470  def tokens(self):
471  return self._tokens


Member Data Documentation

generate_TableFunctionsFactory_init.Tokenize._line
private

Definition at line 458 of file generate_TableFunctionsFactory_init.py.

Referenced by generate_TableFunctionsFactory_init.Tokenize.line().

generate_TableFunctionsFactory_init.Tokenize._tokens
private

Definition at line 459 of file generate_TableFunctionsFactory_init.py.

Referenced by generate_TableFunctionsFactory_init.Parser.consume(), generate_TableFunctionsFactory_init.Parser.current_token(), generate_TableFunctionsFactory_init.Parser.expect(), generate_TableFunctionsFactory_init.Parser.is_at_end(), generate_TableFunctionsFactory_init.Parser.lookahead(), generate_TableFunctionsFactory_init.Tokenize.tokens(), and generate_TableFunctionsFactory_init.Parser.tokens().

generate_TableFunctionsFactory_init.Tokenize.curr

Definition at line 461 of file generate_TableFunctionsFactory_init.py.

Referenced by generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), generate_TableFunctionsFactory_init.Tokenize.current_token(), generate_TableFunctionsFactory_init.Tokenize.is_at_end(), generate_TableFunctionsFactory_init.Tokenize.lookahead(), generate_TableFunctionsFactory_init.Tokenize.peek(), generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error(), and generate_TableFunctionsFactory_init.Tokenize.tokenize().

generate_TableFunctionsFactory_init.Tokenize.start

Definition at line 460 of file generate_TableFunctionsFactory_init.py.

Referenced by generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.current_token(), and generate_TableFunctionsFactory_init.Tokenize.tokenize().


The documentation for this class was generated from the following file:

generate_TableFunctionsFactory_init.py