OmniSciDB  471d68cefb
generate_TableFunctionsFactory_init.Tokenize Class Reference

Public Member Functions

def __init__
 
def line
 
def tokens
 
def tokenize
 
def is_at_end
 
def current_token
 
def add_token
 
def lookahead
 
def advance
 
def peek
 
def can_token_be_double_char
 
def consume_double_char
 
def consume_single_char
 
def consume_whitespace
 
def consume_string
 
def consume_number
 
def consume_identifier
 
def is_token_identifier
 
def is_token_string
 
def is_digit
 
def is_alpha
 
def is_token_whitespace
 
def raise_tokenize_error
 

Public Attributes

 start
 
 curr
 

Private Attributes

 _line
 
 _tokens
 

Detailed Description

Definition at line 417 of file generate_TableFunctionsFactory_init.py.

Constructor & Destructor Documentation

Member Function Documentation

def generate_TableFunctionsFactory_init.Tokenize.add_token (self, type)

Definition at line 456 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), generate_TableFunctionsFactory_init.UdtfNode.line, generate_TableFunctionsFactory_init.Parser.line, foreign_storage::Interval< T >.start, JoinColumnIterator.start, com.omnisci.jdbc.OmniSciEscapeParser.Pair.start, JoinColumnTyped::Slice.start, JoinColumnTuple::Slice.start, generate_TableFunctionsFactory_init.Tokenize.start, and import_export::ImportStatus.start.

Referenced by generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_identifier(), generate_TableFunctionsFactory_init.Tokenize.consume_number(), generate_TableFunctionsFactory_init.Tokenize.consume_single_char(), and generate_TableFunctionsFactory_init.Tokenize.consume_string().

457  def add_token(self, type):
458  lexeme = self.line[self.start:self.curr + 1]
459  self._tokens.append(Token(type, lexeme))
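
The lexeme is the inclusive slice of the current line from `start` to `curr`. A minimal sketch of that slicing, where the `Token` namedtuple is a hypothetical stand-in for the generator's real `Token` class:

```python
from collections import namedtuple

# Hypothetical stand-in for the generator's Token class.
Token = namedtuple("Token", ["type", "lexeme"])

line = "f -> int"
start, curr = 2, 3             # window covering the two characters of "->"
lexeme = line[start:curr + 1]  # inclusive slice, as in add_token
token = Token("RARROW", lexeme)
print(token.lexeme)            # "->"
```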


def generate_TableFunctionsFactory_init.Tokenize.advance (self)

Definition at line 465 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr.

Referenced by generate_TableFunctionsFactory_init.Parser.consume(), generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_identifier(), generate_TableFunctionsFactory_init.Tokenize.consume_number(), generate_TableFunctionsFactory_init.Tokenize.consume_single_char(), generate_TableFunctionsFactory_init.Tokenize.consume_string(), generate_TableFunctionsFactory_init.Tokenize.consume_whitespace(), and generate_TableFunctionsFactory_init.Parser.expect().

466  def advance(self):
467  self.curr += 1


def generate_TableFunctionsFactory_init.Tokenize.can_token_be_double_char (self)

Definition at line 471 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

472  def can_token_be_double_char(self):
473  char = self.peek()
474  return char in ("-",)


def generate_TableFunctionsFactory_init.Tokenize.consume_double_char (self)

Definition at line 475 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), anonymous_namespace{RelAlgDagBuilder.cpp}::RANodeIterator.advance(), generate_TableFunctionsFactory_init.Tokenize.lookahead(), and generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

476  def consume_double_char(self):
477  ahead = self.lookahead()
478  if ahead == ">":
479  self.advance()
480  self.add_token(Token.RARROW) # ->
481  self.advance()
482  else:
483  self.raise_tokenize_error()


def generate_TableFunctionsFactory_init.Tokenize.consume_identifier (self)
IDENTIFIER: [A-Za-z_][A-Za-z0-9_]*

Definition at line 542 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), anonymous_namespace{RelAlgDagBuilder.cpp}::RANodeIterator.advance(), and generate_TableFunctionsFactory_init.Tokenize.lookahead().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

543  def consume_identifier(self):
544  """
545  IDENTIFIER: [A-Za-z_][A-Za-z0-9_]*
546  """
547  while True:
548  char = self.lookahead()
549  if char and char.isalnum() or char == "_":
550  self.advance()
551  else:
552  break
553  self.add_token(Token.IDENTIFIER)
554  self.advance()
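
The IDENTIFIER grammar documented above can be cross-checked with an equivalent regular expression. This is a verification sketch, not the generator's implementation, which scans character by character:

```python
import re

# Regex equivalent of IDENTIFIER: [A-Za-z_][A-Za-z0-9_]*
IDENT = re.compile(r"[A-Za-z_][A-Za-z0-9_]*\Z")

print(bool(IDENT.match("row_mult")))  # True  - letters, digits, underscore
print(bool(IDENT.match("_tmp1")))     # True  - leading underscore is allowed
print(bool(IDENT.match("1abc")))      # False - may not start with a digit
```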


def generate_TableFunctionsFactory_init.Tokenize.consume_number (self)
NUMBER: [0-9]+

Definition at line 529 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), anonymous_namespace{RelAlgDagBuilder.cpp}::RANodeIterator.advance(), and generate_TableFunctionsFactory_init.Tokenize.lookahead().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

530  def consume_number(self):
531  """
532  NUMBER: [0-9]+
533  """
534  while True:
535  char = self.lookahead()
536  if char and char.isdigit():
537  self.advance()
538  else:
539  break
540  self.add_token(Token.NUMBER)
541  self.advance()


def generate_TableFunctionsFactory_init.Tokenize.consume_single_char (self)

Definition at line 484 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), anonymous_namespace{RelAlgDagBuilder.cpp}::RANodeIterator.advance(), generate_TableFunctionsFactory_init.Tokenize.peek(), and generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

485  def consume_single_char(self):
486  char = self.peek()
487  if char == "(":
488  self.add_token(Token.LPAR)
489  elif char == ")":
490  self.add_token(Token.RPAR)
491  elif char == "<":
492  self.add_token(Token.LESS)
493  elif char == ">":
494  self.add_token(Token.GREATER)
495  elif char == ",":
496  self.add_token(Token.COMMA)
497  elif char == "=":
498  self.add_token(Token.EQUAL)
499  elif char == "|":
500  self.add_token(Token.VBAR)
501  elif char == "!":
502  self.add_token(Token.BANG)
503  elif char == "[":
504  self.add_token(Token.LSQB)
505  elif char == "]":
506  self.add_token(Token.RSQB)
507  elif char == ":":
508  self.add_token(Token.COLON)
509  else:
510  self.raise_tokenize_error()
511  self.advance()


def generate_TableFunctionsFactory_init.Tokenize.consume_string (self)
STRING: \".*?\"

Definition at line 515 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), anonymous_namespace{RelAlgDagBuilder.cpp}::RANodeIterator.advance(), generate_TableFunctionsFactory_init.Tokenize.lookahead(), and generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

516  def consume_string(self):
517  """
518  STRING: \".*?\"
519  """
520  while True:
521  char = self.lookahead()
522  curr = self.peek()
523  if char == '"' and curr != '\\':
524  self.advance()
525  break
526  self.advance()
527  self.add_token(Token.STRING)
528  self.advance()
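
Although the docstring gives `STRING: \".*?\"`, the loop above refuses to close on a quote preceded by a backslash. A regex sketch of that behavior (an approximation for illustration, not the generator's code):

```python
import re

# A string is a quote, then escaped pairs or non-quote characters, then a quote.
STRING = re.compile(r'"(?:\\.|[^"\\])*"')

m = STRING.match(r'"uses \" inside" and more')
print(m.group())  # "uses \" inside"
```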


def generate_TableFunctionsFactory_init.Tokenize.consume_whitespace (self)

Definition at line 512 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.advance(), and anonymous_namespace{RelAlgDagBuilder.cpp}::RANodeIterator.advance().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

513  def consume_whitespace(self):
514  self.advance()


def generate_TableFunctionsFactory_init.Tokenize.current_token (self)

Definition at line 453 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), generate_TableFunctionsFactory_init.UdtfNode.line, generate_TableFunctionsFactory_init.Parser.line, foreign_storage::Interval< T >.start, JoinColumnIterator.start, com.omnisci.jdbc.OmniSciEscapeParser.Pair.start, JoinColumnTyped::Slice.start, JoinColumnTuple::Slice.start, generate_TableFunctionsFactory_init.Tokenize.start, and import_export::ImportStatus.start.

Referenced by generate_TableFunctionsFactory_init.Parser.consume(), generate_TableFunctionsFactory_init.Parser.expect(), generate_TableFunctionsFactory_init.Parser.match(), and generate_TableFunctionsFactory_init.Parser.raise_parser_error().

454  def current_token(self):
455  return self.line[self.start:self.curr + 1]


def generate_TableFunctionsFactory_init.Tokenize.is_alpha (self)

Definition at line 564 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

565  def is_alpha(self):
566  return self.peek().isalpha()


def generate_TableFunctionsFactory_init.Tokenize.is_at_end (self)

Definition at line 450 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), generate_TableFunctionsFactory_init.UdtfNode.line, and generate_TableFunctionsFactory_init.Parser.line.

Referenced by generate_TableFunctionsFactory_init.Parser.parse_annotation(), generate_TableFunctionsFactory_init.Parser.parse_arg(), generate_TableFunctionsFactory_init.Parser.parse_args(), generate_TableFunctionsFactory_init.Parser.parse_templates(), generate_TableFunctionsFactory_init.Parser.parse_type(), generate_TableFunctionsFactory_init.Parser.parse_udtf(), and generate_TableFunctionsFactory_init.Tokenize.tokenize().

451  def is_at_end(self):
452  return len(self.line) == self.curr


def generate_TableFunctionsFactory_init.Tokenize.is_digit (self)

Definition at line 561 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

562  def is_digit(self):
563  return self.peek().isdigit()


def generate_TableFunctionsFactory_init.Tokenize.is_token_identifier (self)

Definition at line 555 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

556  def is_token_identifier(self):
557  return self.peek().isalpha() or self.peek() == "_"


def generate_TableFunctionsFactory_init.Tokenize.is_token_string (self)

Definition at line 558 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

559  def is_token_string(self):
560  return self.peek() == '"'


def generate_TableFunctionsFactory_init.Tokenize.is_token_whitespace (self)

Definition at line 567 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.tokenize().

568  def is_token_whitespace(self):
569  return self.peek().isspace()


def generate_TableFunctionsFactory_init.Tokenize.line (self)

Definition at line 426 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize._line.

Referenced by generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.current_token(), generate_TableFunctionsFactory_init.Tokenize.is_at_end(), generate_TableFunctionsFactory_init.Tokenize.lookahead(), generate_TableFunctionsFactory_init.Tokenize.peek(), and generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error().

427  def line(self):
428  return self._line


def generate_TableFunctionsFactory_init.Tokenize.lookahead (self)

Definition at line 460 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), generate_TableFunctionsFactory_init.UdtfNode.line, and generate_TableFunctionsFactory_init.Parser.line.

Referenced by generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_identifier(), generate_TableFunctionsFactory_init.Tokenize.consume_number(), and generate_TableFunctionsFactory_init.Tokenize.consume_string().

461  def lookahead(self):
462  if self.curr + 1 >= len(self.line):
463  return None
464  return self.line[self.curr + 1]
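
Together with peek() below, this gives the scanner a one-character window: peek reads the character at `curr`, lookahead reads one past it and returns None at end of line. A standalone illustration using free functions (in the class these are methods over `self.line` and `self.curr`):

```python
def peek(line, curr):
    # Character under the cursor.
    return line[curr]

def lookahead(line, curr):
    # Character after the cursor, or None past the end of the line.
    if curr + 1 >= len(line):
        return None
    return line[curr + 1]

line = "->"
print(peek(line, 0))       # '-'
print(lookahead(line, 0))  # '>'
print(lookahead(line, 1))  # None - cursor is on the last character
```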


def generate_TableFunctionsFactory_init.Tokenize.peek (self)

Definition at line 468 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), generate_TableFunctionsFactory_init.UdtfNode.line, and generate_TableFunctionsFactory_init.Parser.line.

Referenced by generate_TableFunctionsFactory_init.Tokenize.can_token_be_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_single_char(), generate_TableFunctionsFactory_init.Tokenize.consume_string(), generate_TableFunctionsFactory_init.Tokenize.is_alpha(), generate_TableFunctionsFactory_init.Tokenize.is_digit(), generate_TableFunctionsFactory_init.Tokenize.is_token_identifier(), generate_TableFunctionsFactory_init.Tokenize.is_token_string(), generate_TableFunctionsFactory_init.Tokenize.is_token_whitespace(), and generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error().

469  def peek(self):
470  return self.line[self.curr]


def generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error (self)

Definition at line 570 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.line(), generate_TableFunctionsFactory_init.UdtfNode.line, generate_TableFunctionsFactory_init.Parser.line, and generate_TableFunctionsFactory_init.Tokenize.peek().

Referenced by generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), and generate_TableFunctionsFactory_init.Tokenize.consume_single_char().

571  def raise_tokenize_error(self):
572  curr = self.curr
573  char = self.peek()
574  raise TokenizeException(
575  'Could not match char "%s" at pos %d on line\n %s' % (char, curr, self.line)
576  )


def generate_TableFunctionsFactory_init.Tokenize.tokenize (self)

Definition at line 433 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize.can_token_be_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_double_char(), generate_TableFunctionsFactory_init.Tokenize.consume_identifier(), generate_TableFunctionsFactory_init.Tokenize.consume_number(), generate_TableFunctionsFactory_init.Tokenize.consume_single_char(), generate_TableFunctionsFactory_init.Tokenize.consume_string(), generate_TableFunctionsFactory_init.Tokenize.consume_whitespace(), generate_TableFunctionsFactory_init.Tokenize.curr, generate_TableFunctionsFactory_init.Tokenize.is_at_end(), generate_TableFunctionsFactory_init.Tokenize.is_digit(), generate_TableFunctionsFactory_init.Tokenize.is_token_identifier(), generate_TableFunctionsFactory_init.Tokenize.is_token_string(), generate_TableFunctionsFactory_init.Tokenize.is_token_whitespace(), foreign_storage::Interval< T >.start, JoinColumnIterator.start, com.omnisci.jdbc.OmniSciEscapeParser.Pair.start, JoinColumnTyped::Slice.start, JoinColumnTuple::Slice.start, generate_TableFunctionsFactory_init.Tokenize.start, and import_export::ImportStatus.start.

434  def tokenize(self):
435  while not self.is_at_end():
436  self.start = self.curr
437 
438  if self.is_token_whitespace():
439  self.consume_whitespace()
440  elif self.is_digit():
441  self.consume_number()
442  elif self.is_token_string():
443  self.consume_string()
444  elif self.is_token_identifier():
445  self.consume_identifier()
446  elif self.can_token_be_double_char():
447  self.consume_double_char()
448  else:
449  self.consume_single_char()
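
The dispatch loop above can be condensed into a self-contained sketch that applies the same priority order: whitespace, NUMBER, STRING, IDENTIFIER, the "->" double char, then single characters. Token names mirror the listing, but representing a token as a plain (type, lexeme) tuple, collapsing all single chars to one "CHAR" kind, and matching each class with a regex are simplifications, not the generator's implementation:

```python
import re

def tokenize(line):
    tokens, curr = [], 0
    while curr < len(line):
        ch = line[curr]
        if ch.isspace():
            curr += 1
        elif ch.isdigit():
            m = re.match(r"[0-9]+", line[curr:])          # NUMBER: [0-9]+
            tokens.append(("NUMBER", m.group()))
            curr += m.end()
        elif ch == '"':
            m = re.match(r'"(?:\\.|[^"\\])*"', line[curr:])
            tokens.append(("STRING", m.group()))
            curr += m.end()
        elif ch.isalpha() or ch == "_":
            m = re.match(r"[A-Za-z_][A-Za-z0-9_]*", line[curr:])
            tokens.append(("IDENTIFIER", m.group()))
            curr += m.end()
        elif line[curr:curr + 2] == "->":                  # the double char
            tokens.append(("RARROW", "->"))
            curr += 2
        else:                                              # everything else
            tokens.append(("CHAR", ch))
            curr += 1
    return tokens

print(tokenize("udtf(x) -> int"))
```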


def generate_TableFunctionsFactory_init.Tokenize.tokens (self)

Definition at line 430 of file generate_TableFunctionsFactory_init.py.

References generate_TableFunctionsFactory_init.Tokenize._tokens.

Referenced by generate_TableFunctionsFactory_init.Parser.raise_parser_error().

431  def tokens(self):
432  return self._tokens


Member Data Documentation

generate_TableFunctionsFactory_init.Tokenize._line (private)

Definition at line 419 of file generate_TableFunctionsFactory_init.py.

Referenced by generate_TableFunctionsFactory_init.Tokenize.line().

generate_TableFunctionsFactory_init.Tokenize._tokens (private)

Definition at line 420 of file generate_TableFunctionsFactory_init.py.

Referenced by generate_TableFunctionsFactory_init.Parser.consume(), generate_TableFunctionsFactory_init.Parser.current_token(), generate_TableFunctionsFactory_init.Parser.expect(), generate_TableFunctionsFactory_init.Parser.is_at_end(), generate_TableFunctionsFactory_init.Tokenize.tokens(), and generate_TableFunctionsFactory_init.Parser.tokens().

generate_TableFunctionsFactory_init.Tokenize.curr

Definition at line 422 of file generate_TableFunctionsFactory_init.py.

Referenced by generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.advance(), generate_TableFunctionsFactory_init.Tokenize.current_token(), generate_TableFunctionsFactory_init.Tokenize.is_at_end(), generate_TableFunctionsFactory_init.Tokenize.lookahead(), generate_TableFunctionsFactory_init.Tokenize.peek(), generate_TableFunctionsFactory_init.Tokenize.raise_tokenize_error(), and generate_TableFunctionsFactory_init.Tokenize.tokenize().

generate_TableFunctionsFactory_init.Tokenize.start

Definition at line 421 of file generate_TableFunctionsFactory_init.py.

Referenced by generate_TableFunctionsFactory_init.Tokenize.add_token(), generate_TableFunctionsFactory_init.Tokenize.current_token(), and generate_TableFunctionsFactory_init.Tokenize.tokenize().


The documentation for this class was generated from the following file: generate_TableFunctionsFactory_init.py