The shlex module provides a simple lexer (also known as a tokenizer) for languages based on Unix shell syntax. Its use is demonstrated in Example 5-19.
Example 5-19. Using the shlex Module
File: shlex-example-1.py

import shlex

lexer = shlex.shlex(open("samples/sample.netrc", "r"))
lexer.wordchars = lexer.wordchars + "._"

while 1:
    token = lexer.get_token()
    if not token:
        break
    print repr(token)

'machine'
'secret.fbi'
'login'
'mulder'
'password'
'trustno1'
'machine'
'non.secret.fbi'
'login'
'scully'
'password'
'noway'
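The same technique also works on an in-memory string, since shlex.shlex accepts a string as its source. The sketch below (the inline data is illustrative, standing in for one line of the sample netrc file) shows why the wordchars adjustment matters: without adding "." to the word characters, a hostname like secret.fbi would be split into three tokens.

```python
import shlex

# Illustrative netrc-style input (not the actual samples/sample.netrc file)
data = "machine secret.fbi login mulder password trustno1"

lexer = shlex.shlex(data)
# Allow "." and "_" inside tokens so hostnames stay in one piece
lexer.wordchars = lexer.wordchars + "._"

# shlex instances are iterable; iteration stops at end of input
tokens = list(lexer)
print(tokens)
# ['machine', 'secret.fbi', 'login', 'mulder', 'password', 'trustno1']
```

Iterating over the lexer is equivalent to calling get_token in a loop until it returns an empty result, as the example above does.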