OmniSciDB  a5dc49c757
TableFunctionsFactory_parser.Tokenize Class Reference

Public Member Functions

def __init__
 
def line
 
def tokens
 
def tokenize
 
def is_at_end
 
def current_token
 
def add_token
 
def lookahead
 
def advance
 
def peek
 
def can_token_be_double_char
 
def consume_double_char
 
def consume_single_char
 
def consume_whitespace
 
def consume_string
 
def consume_number
 
def consume_identifier_or_boolean
 
def is_token_identifier_or_boolean
 
def is_token_string
 
def is_number
 
def is_alpha
 
def is_token_whitespace
 
def raise_tokenize_error
 

Public Attributes

 start
 
 curr
 

Private Attributes

 _line
 
 _tokens
 

Detailed Description

Definition at line 79 of file TableFunctionsFactory_parser.py.
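
A minimal usage sketch (not part of the generated documentation): the constructor body is not shown on this page, so it is assumed here that Tokenize is constructed with the raw signature line and that tokenize() fills the token list returned by tokens().

    # Hypothetical usage; Tokenize(line) taking the raw line string is an
    # assumption, and __init__ may already invoke tokenize() itself.
    t = Tokenize("foo(Column<int32> x) -> Column<int32>")
    t.tokenize()              # re-running is a no-op once the end of the line is reached
    for tok in t.tokens():
        print(tok)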

Constructor & Destructor Documentation

Member Function Documentation

def TableFunctionsFactory_parser.Tokenize.add_token (self, type)
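
No code fragment is rendered for this member. Based on current_token() and the way the consume_* methods call it, a plausible body is sketched below; this is an assumption, not the verbatim source, and the Token constructor signature is likewise assumed.

    def add_token(self, type):
        # Assumed: record the lexeme spanning [start, curr] with the given token type.
        lexeme = self.line[self.start:self.curr + 1]
        self._tokens.append(Token(type, lexeme))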

def TableFunctionsFactory_parser.Tokenize.advance (   self)

Definition at line 127 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.curr.

Referenced by TableFunctionsFactory_parser.Parser.consume(), TableFunctionsFactory_parser.Tokenize.consume_double_char(), TableFunctionsFactory_parser.Tokenize.consume_identifier_or_boolean(), TableFunctionsFactory_parser.Tokenize.consume_number(), TableFunctionsFactory_parser.Tokenize.consume_single_char(), TableFunctionsFactory_parser.Tokenize.consume_string(), TableFunctionsFactory_parser.Tokenize.consume_whitespace(), and TableFunctionsFactory_parser.Parser.expect().

128  def advance(self):
129      self.curr += 1

def TableFunctionsFactory_parser.Tokenize.can_token_be_double_char (   self)

Definition at line 133 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.peek().

Referenced by TableFunctionsFactory_parser.Tokenize.tokenize().

134  def can_token_be_double_char(self):
135      char = self.peek()
136      return char in ("-",)

def TableFunctionsFactory_parser.Tokenize.consume_double_char (   self)

Definition at line 137 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.add_token(), TableFunctionsFactory_parser.Tokenize.advance(), TableFunctionsFactory_parser.Tokenize.lookahead(), and TableFunctionsFactory_parser.Tokenize.raise_tokenize_error().

Referenced by TableFunctionsFactory_parser.Tokenize.tokenize().

138  def consume_double_char(self):
139      ahead = self.lookahead()
140      if ahead == ">":
141          self.advance()
142          self.add_token(Token.RARROW) # ->
143          self.advance()
144      else:
145          self.raise_tokenize_error()

def TableFunctionsFactory_parser.Tokenize.consume_identifier_or_boolean (   self)
IDENTIFIER: [A-Za-z_][A-Za-z0-9_]*

Definition at line 211 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.add_token(), TableFunctionsFactory_parser.Tokenize.advance(), TableFunctionsFactory_parser.Tokenize.current_token(), and TableFunctionsFactory_parser.Tokenize.lookahead().

Referenced by TableFunctionsFactory_parser.Tokenize.tokenize().

213  """
214  IDENTIFIER: [A-Za-z_][A-Za-z0-9_]*
215  """
216  while True:
217  char = self.lookahead()
218  if char and char.isalnum() or char == "_":
219  self.advance()
220  else:
221  break
222  if self.current_token().lower() in ("true", "false"):
223  self.add_token(Token.BOOLEAN)
224  else:
225  self.add_token(Token.IDENTIFIER)
226  self.advance()
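
For illustration only (this helper is not part of the class), the IDENTIFIER grammar and the boolean special case above can be restated as:

    import re

    IDENTIFIER = re.compile(r"[A-Za-z_][A-Za-z0-9_]*")

    def classify(lexeme):
        # An identifier spelling "true"/"false" (case-insensitively) becomes a BOOLEAN token.
        assert IDENTIFIER.fullmatch(lexeme)
        return "BOOLEAN" if lexeme.lower() in ("true", "false") else "IDENTIFIER"

    assert classify("TextEncodingNone") == "IDENTIFIER"
    assert classify("True") == "BOOLEAN"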

def TableFunctionsFactory_parser.Tokenize.consume_number (   self)
NUMBER: [-]([0-9]*[.])?[0-9]+

Definition at line 191 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.add_token(), TableFunctionsFactory_parser.Tokenize.advance(), and TableFunctionsFactory_parser.Tokenize.lookahead().

Referenced by TableFunctionsFactory_parser.Tokenize.tokenize().

192  def consume_number(self):
193      """
194      NUMBER: [-]([0-9]*[.])?[0-9]+
195      """
196      found_dot = False
197      while True:
198          char = self.lookahead()
199          if char:
200              if char.isdigit():
201                  self.advance()
202              elif char == "." and not found_dot:
203                  found_dot = True
204                  self.advance()
205              else:
206                  break
207          else:
208              break
209      self.add_token(Token.NUMBER)
210      self.advance()

def TableFunctionsFactory_parser.Tokenize.consume_single_char (   self)

Definition at line 146 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.add_token(), TableFunctionsFactory_parser.Tokenize.advance(), TableFunctionsFactory_parser.Tokenize.peek(), and TableFunctionsFactory_parser.Tokenize.raise_tokenize_error().

Referenced by TableFunctionsFactory_parser.Tokenize.tokenize().

147  def consume_single_char(self):
148      char = self.peek()
149      if char == "(":
150          self.add_token(Token.LPAR)
151      elif char == ")":
152          self.add_token(Token.RPAR)
153      elif char == "<":
154          self.add_token(Token.LESS)
155      elif char == ">":
156          self.add_token(Token.GREATER)
157      elif char == ",":
158          self.add_token(Token.COMMA)
159      elif char == "=":
160          self.add_token(Token.EQUAL)
161      elif char == "|":
162          self.add_token(Token.VBAR)
163      elif char == "!":
164          self.add_token(Token.BANG)
165      elif char == "[":
166          self.add_token(Token.LSQB)
167      elif char == "]":
168          self.add_token(Token.RSQB)
169      elif char == ":":
170          self.add_token(Token.COLON)
171      else:
172          self.raise_tokenize_error()
173      self.advance()

def TableFunctionsFactory_parser.Tokenize.consume_string (   self)
STRING: \".*?\"

Definition at line 177 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.add_token(), TableFunctionsFactory_parser.Tokenize.advance(), TableFunctionsFactory_parser.Tokenize.lookahead(), and TableFunctionsFactory_parser.Tokenize.peek().

Referenced by TableFunctionsFactory_parser.Tokenize.tokenize().

178  def consume_string(self):
179      """
180      STRING: \".*?\"
181      """
182      while True:
183          char = self.lookahead()
184          curr = self.peek()
185          if char == '"' and curr != '\\':
186              self.advance()
187              break
188          self.advance()
189      self.add_token(Token.STRING)
190      self.advance()
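
For illustration, the documented STRING pattern is a non-greedy double-quoted literal; the loop above additionally refuses to treat a quote that immediately follows a backslash as the terminator.

    import re

    STRING = re.compile(r'".*?"')

    assert STRING.match('"default" | other').group() == '"default"'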

def TableFunctionsFactory_parser.Tokenize.consume_whitespace (   self)

Definition at line 174 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.advance().

Referenced by TableFunctionsFactory_parser.Tokenize.tokenize().

175  def consume_whitespace(self):
176      self.advance()

def TableFunctionsFactory_parser.Tokenize.current_token (   self)

Definition at line 115 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.curr, TableFunctionsFactory_parser.Tokenize.line(), and TableFunctionsFactory_parser.Tokenize.start.

Referenced by TableFunctionsFactory_parser.Parser.consume(), TableFunctionsFactory_parser.Tokenize.consume_identifier_or_boolean(), TableFunctionsFactory_parser.Parser.expect(), TableFunctionsFactory_parser.Parser.match(), TableFunctionsFactory_parser.Parser.parse_annotation(), and TableFunctionsFactory_parser.Parser.raise_parser_error().

116  def current_token(self):
117      return self.line[self.start:self.curr + 1]

def TableFunctionsFactory_parser.Tokenize.is_alpha (   self)

Definition at line 237 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.peek().

238  def is_alpha(self):
239      return self.peek().isalpha()

def TableFunctionsFactory_parser.Tokenize.is_at_end (   self)

Definition at line 112 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.curr, and TableFunctionsFactory_parser.Tokenize.line().

Referenced by TableFunctionsFactory_parser.Parser.parse_annotation(), TableFunctionsFactory_parser.Parser.parse_arg(), TableFunctionsFactory_parser.Parser.parse_args(), TableFunctionsFactory_parser.Parser.parse_templates(), TableFunctionsFactory_parser.Parser.parse_type(), TableFunctionsFactory_parser.Parser.parse_udtf(), and TableFunctionsFactory_parser.Tokenize.tokenize().

113  def is_at_end(self):
114      return len(self.line) == self.curr

def TableFunctionsFactory_parser.Tokenize.is_number (   self)

Definition at line 233 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.lookahead(), and TableFunctionsFactory_parser.Tokenize.peek().

Referenced by TableFunctionsFactory_parser.Tokenize.tokenize().

234  def is_number(self):
235      return self.peek().isdigit() or (self.peek() == '-' \
236          and self.lookahead().isdigit())

def TableFunctionsFactory_parser.Tokenize.is_token_identifier_or_boolean (   self)

Definition at line 227 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.peek().

Referenced by TableFunctionsFactory_parser.Tokenize.tokenize().

228  def is_token_identifier_or_boolean(self):
229      return self.peek().isalpha() or self.peek() == "_"

def TableFunctionsFactory_parser.Tokenize.is_token_string (   self)

Definition at line 230 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.peek().

Referenced by TableFunctionsFactory_parser.Tokenize.tokenize().

231  def is_token_string(self):
232      return self.peek() == '"'

def TableFunctionsFactory_parser.Tokenize.is_token_whitespace (   self)

Definition at line 240 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.peek().

Referenced by TableFunctionsFactory_parser.Tokenize.tokenize().

241  def is_token_whitespace(self):
242      return self.peek().isspace()

def TableFunctionsFactory_parser.Tokenize.line (   self)

Definition at line 88 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize._line.

Referenced by TableFunctionsFactory_parser.Tokenize.add_token(), TableFunctionsFactory_parser.Tokenize.current_token(), TableFunctionsFactory_parser.Tokenize.is_at_end(), TableFunctionsFactory_parser.Tokenize.lookahead(), TableFunctionsFactory_parser.Tokenize.peek(), and TableFunctionsFactory_parser.Tokenize.raise_tokenize_error().

88
89  def line(self):
90      return self._line

def TableFunctionsFactory_parser.Tokenize.lookahead (   self)

Definition at line 122 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.curr, and TableFunctionsFactory_parser.Tokenize.line().

Referenced by TableFunctionsFactory_parser.Tokenize.consume_double_char(), TableFunctionsFactory_parser.Tokenize.consume_identifier_or_boolean(), TableFunctionsFactory_parser.Tokenize.consume_number(), TableFunctionsFactory_parser.Tokenize.consume_string(), TableFunctionsFactory_parser.Tokenize.is_number(), and TableFunctionsFactory_parser.Parser.parse_arg().

123  def lookahead(self):
124      if self.curr + 1 >= len(self.line):
125          return None
126      return self.line[self.curr + 1]

def TableFunctionsFactory_parser.Tokenize.peek (   self)

Definition at line 130 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.curr, and TableFunctionsFactory_parser.Tokenize.line().

Referenced by TableFunctionsFactory_parser.Tokenize.can_token_be_double_char(), TableFunctionsFactory_parser.Tokenize.consume_single_char(), TableFunctionsFactory_parser.Tokenize.consume_string(), TableFunctionsFactory_parser.Tokenize.is_alpha(), TableFunctionsFactory_parser.Tokenize.is_number(), TableFunctionsFactory_parser.Tokenize.is_token_identifier_or_boolean(), TableFunctionsFactory_parser.Tokenize.is_token_string(), TableFunctionsFactory_parser.Tokenize.is_token_whitespace(), and TableFunctionsFactory_parser.Tokenize.raise_tokenize_error().

131  def peek(self):
132      return self.line[self.curr]

def TableFunctionsFactory_parser.Tokenize.raise_tokenize_error (   self)

Definition at line 243 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.curr, TableFunctionsFactory_parser.Tokenize.line(), and TableFunctionsFactory_parser.Tokenize.peek().

Referenced by TableFunctionsFactory_parser.Tokenize.consume_double_char(), and TableFunctionsFactory_parser.Tokenize.consume_single_char().

244  def raise_tokenize_error(self):
245      curr = self.curr
246      char = self.peek()
247      raise TokenizeException(
248          'Could not match char "%s" at pos %d on line\n %s' % (char, curr, self.line)
249      )
250

def TableFunctionsFactory_parser.Tokenize.tokenize (   self)

Definition at line 95 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize.can_token_be_double_char(), TableFunctionsFactory_parser.Tokenize.consume_double_char(), TableFunctionsFactory_parser.Tokenize.consume_identifier_or_boolean(), TableFunctionsFactory_parser.Tokenize.consume_number(), TableFunctionsFactory_parser.Tokenize.consume_single_char(), TableFunctionsFactory_parser.Tokenize.consume_string(), TableFunctionsFactory_parser.Tokenize.consume_whitespace(), TableFunctionsFactory_parser.Tokenize.curr, TableFunctionsFactory_parser.Tokenize.is_at_end(), TableFunctionsFactory_parser.Tokenize.is_number(), TableFunctionsFactory_parser.Tokenize.is_token_identifier_or_boolean(), TableFunctionsFactory_parser.Tokenize.is_token_string(), TableFunctionsFactory_parser.Tokenize.is_token_whitespace(), and TableFunctionsFactory_parser.Tokenize.start.

95
96  def tokenize(self):
97      while not self.is_at_end():
98          self.start = self.curr
99
100          if self.is_token_whitespace():
101              self.consume_whitespace()
102          elif self.is_number():
103              self.consume_number()
104          elif self.is_token_string():
105              self.consume_string()
106          elif self.is_token_identifier_or_boolean():
107              self.consume_identifier_or_boolean()
108          elif self.can_token_be_double_char():
109              self.consume_double_char()
110          else:
111              self.consume_single_char()
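
The order of these checks matters for a leading "-": is_number() accepts "-" only when the next character is a digit, so "-3" is consumed as a NUMBER while "->" falls through to can_token_be_double_char(). A minimal illustration of that precedence, using only the logic shown on this page:

    def starts_number(line, pos):
        # Mirrors is_number(): a digit, or "-" immediately followed by a digit.
        ch = line[pos]
        nxt = line[pos + 1] if pos + 1 < len(line) else ""
        return ch.isdigit() or (ch == "-" and nxt.isdigit())

    assert starts_number("-3", 0)        # handled by consume_number() -> NUMBER
    assert not starts_number("->", 0)    # handled by consume_double_char() -> RARROW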

def TableFunctionsFactory_parser.Tokenize.tokens (   self)

Definition at line 92 of file TableFunctionsFactory_parser.py.

References TableFunctionsFactory_parser.Tokenize._tokens.

Referenced by TableFunctionsFactory_parser.Parser.raise_parser_error().

92
93  def tokens(self):
94      return self._tokens

Member Data Documentation

TableFunctionsFactory_parser.Tokenize._line
private

Definition at line 81 of file TableFunctionsFactory_parser.py.

Referenced by TableFunctionsFactory_parser.Tokenize.line().

TableFunctionsFactory_parser.Tokenize._tokens
private

Definition at line 82 of file TableFunctionsFactory_parser.py.

Referenced by TableFunctionsFactory_parser.Parser.consume(), TableFunctionsFactory_parser.Parser.current_token(), TableFunctionsFactory_parser.Parser.expect(), TableFunctionsFactory_parser.Parser.is_at_end(), TableFunctionsFactory_parser.Parser.lookahead(), TableFunctionsFactory_parser.Tokenize.tokens(), and TableFunctionsFactory_parser.Parser.tokens().

TableFunctionsFactory_parser.Tokenize.curr

Definition at line 84 of file TableFunctionsFactory_parser.py.

Referenced by TableFunctionsFactory_parser.Tokenize.add_token(), TableFunctionsFactory_parser.Tokenize.advance(), TableFunctionsFactory_parser.Tokenize.current_token(), TableFunctionsFactory_parser.Tokenize.is_at_end(), TableFunctionsFactory_parser.Tokenize.lookahead(), TableFunctionsFactory_parser.Tokenize.peek(), TableFunctionsFactory_parser.Tokenize.raise_tokenize_error(), and TableFunctionsFactory_parser.Tokenize.tokenize().

TableFunctionsFactory_parser.Tokenize.start

Definition at line 83 of file TableFunctionsFactory_parser.py.

Referenced by TableFunctionsFactory_parser.Tokenize.add_token(), TableFunctionsFactory_parser.Tokenize.current_token(), and TableFunctionsFactory_parser.Tokenize.tokenize().


The documentation for this class was generated from the following file:
TableFunctionsFactory_parser.py