alex-3.2.5/CHANGELOG.md

## Changes in 3.2.5:
* Build fixes for GHC 8.8.x
## Changes in 3.2.4:
* Remove dependency on QuickCheck
* Change the way that bootstrapping is done: see README.md for build
instructions
## Changes in 3.2.3:
* fix issue when using cpphs (#116)
## Changes in 3.2.2:
* Manage line length in generated files [GH-84]
* Fix issue when an identifier with multiple single quotes, e.g. `foo''`, was used
* Allow omitting spaces around `=` in macro definitions
* Include pre-generated Parser.hs and Scan.hs in the Hackage upload, to
make bootstrapping easier.
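For example, both of these spellings of a macro definition are now accepted (a sketch of `.x` syntax; a real file would of course define each macro only once):

```haskell
$digit = [0-9]   -- spaces around `=` ...
$digit=[0-9]     -- ... or none (allowed since 3.2.2)
```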
## Changes in 3.2.1:
* Fix build problem with GHC; add new test tokens_scan_user.x
## Changes in 3.2.0:
* Allow the token type and productions to be overloaded, and add new
directives: %token, %typeclass, %action. See "Type Signatures and
Typeclasses" in the manual.
* Some small space leak fixes
## Changes in 3.1.7:
* Add support for `%encoding` directive
  (allows controlling `--latin1` from inside Alex scripts)
* Make code forward-compatible with in-progress proposals
* Suppress more warnings
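This lets a scanner pin its encoding in the `.x` file itself rather than on the command line; for example (a sketch — see the manual for the exact set of accepted encoding names):

```haskell
-- near the top of the .x file, alongside the other directives
%encoding "latin-1"
```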
## Changes in 3.1.6:
* `sdist` for 3.1.5 was mis-generated, causing it to ask for Happy
when building.
## Changes in 3.1.5:
* Generate less warning-laden code, and suppress other warnings.
* Bug fixes.
## Changes in 3.1.4:
* Add Applicative/Functor instances for GHC 7.10
## Changes in 3.1.3:
* Fix for clang (XCode 5)
## Changes in 3.1.2:
* Add missing file to extra-source-files
## Changes in 3.1.1:
* Bug fixes (#24, #30, #31, #32)
## Changes in 3.1.0:
* necessary changes to work with GHC 7.8.1
## Changes in 3.0 (since 2.3.5)
* Unicode support (contributed mostly by Jean-Philippe Bernardy,
with help from Alan Zimmerman).
* An Alex lexer now takes a UTF-8 encoded byte sequence as input
(see Section 5.1, “Unicode and UTF-8”). If you are using the
"basic" wrapper or one of the other wrappers that takes a
Haskell String as input, the string is automatically encoded
into UTF-8 by Alex. If your input is a ByteString, you are
responsible for ensuring that the input is UTF-8 encoded. The
old 8-bit behaviour is still available via the --latin1
option.
* Alex source files are assumed to be in UTF-8, like Haskell
source files. The lexer specification can use Unicode
characters and ranges.
* `alexGetChar` is renamed to `alexGetByte` in the generated code.
* There is a new option, `--latin1`, that restores the old
behaviour.
* Alex now does DFA minimization, which helps to reduce the size
of the generated tables, especially for lexers that use Unicode.
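The renamed interface reflects that the generated scanner now consumes one UTF-8 byte at a time instead of one `Char`. The sketch below shows a plausible `String`-backed `alexGetByte`, in the spirit of the "basic" wrapper (the `AlexInput` alias and `utf8Encode` helper here are illustrative, not the exact generated code):

```haskell
import Data.Word (Word8)
import Data.Char (ord)
import Data.Bits (shiftR, (.&.))

-- Illustrative input type: (previous char, pending bytes of the
-- current char, remaining input).
type AlexInput = (Char, [Word8], String)

-- Encode one Char to its UTF-8 byte sequence.
utf8Encode :: Char -> [Word8]
utf8Encode = map fromIntegral . go . ord
  where
    go oc
      | oc <= 0x7f   = [oc]
      | oc <= 0x7ff  = [ 0xc0 + (oc `shiftR` 6)
                       , 0x80 + (oc .&. 0x3f) ]
      | oc <= 0xffff = [ 0xe0 + (oc `shiftR` 12)
                       , 0x80 + ((oc `shiftR` 6) .&. 0x3f)
                       , 0x80 + (oc .&. 0x3f) ]
      | otherwise    = [ 0xf0 + (oc `shiftR` 18)
                       , 0x80 + ((oc `shiftR` 12) .&. 0x3f)
                       , 0x80 + ((oc `shiftR` 6) .&. 0x3f)
                       , 0x80 + (oc .&. 0x3f) ]

-- The Alex 3.x interface: deliver the next byte, or Nothing at EOF.
-- (Alex 2.x instead exposed alexGetChar :: AlexInput -> Maybe (Char, AlexInput).)
alexGetByte :: AlexInput -> Maybe (Word8, AlexInput)
alexGetByte (c, b:bs, s) = Just (b, (c, bs, s))   -- drain pending bytes first
alexGetByte (_, [], [])  = Nothing                -- end of input
alexGetByte (_, [], c:s) =
  case utf8Encode c of
    b:bs -> Just (b, (c, bs, s))
    []   -> Nothing                               -- unreachable: encoding is never empty
```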
alex-3.2.5/LICENSE

Copyright (c) 1995-2011, Chris Dornan and Simon Marlow
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:
* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above
copyright notice, this list of conditions and the following
disclaimer in the documentation and/or other materials provided
with the distribution.
* Neither the name of the copyright holders, nor the names of the
contributors may be used to endorse or promote products derived
from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
alex-3.2.5/README.md

# Alex: A Lexical Analyser Generator
[Build Status](http://travis-ci.org/simonmar/alex)
Alex is a Lex-like tool for generating Haskell scanners. For complete
documentation, see the doc directory.
Alex version 2.0 has changed fairly considerably since version 1.x,
and the syntax is almost completely different. For a detailed list of
changes, see the release notes in the documentation.
Alex is now covered by a BSD-Style licence; see the licence file in
the 'doc' directory for details.
The sources are in the 'src' directory and the documentation in the 'doc'
directory; various examples are in the 'examples' subdirectory.
The source code in the 'src' and 'examples' directories is intended
for a Haskell 98 compiler with hierarchical modules. It should work
with GHC >= 5.04.
## Build Instructions
If you just want to *use* Alex, you can download or install (via
`cabal install alex`) an
[Alex release from Hackage](https://hackage.haskell.org/package/alex); note that
distributions such as the
[Haskell Platform](https://www.haskell.org/platform/) and many system
package managers also provide packages for Alex. Moreover,
recent versions of `cabal` will automatically install the required
version of `alex` based on
[`build-tools`/`build-tool-depends` declarations](http://cabal.readthedocs.io/en/latest/developing-packages.html#pkg-field-build-tool-depends).
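As a sketch, a downstream package can request Alex at build time like this (the `my-lexer` component name and the version bounds are illustrative):

```cabal
-- in your package's .cabal file (cabal-version: 2.0 or later)
executable my-lexer
  main-is:            Main.hs
  build-depends:      base >= 4 && < 5, array
  build-tool-depends: alex:alex >= 3.2
```

With this declaration, recent `cabal` versions fetch and build a suitable `alex` automatically before compiling the component.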
Read on if you want to build Alex directly from Git.
Alex is built using GHC & Cabal; so first install
[GHC](https://www.haskell.org/ghc) and
[`cabal-install-2.0`](https://www.haskell.org/cabal) (or later).
Since Alex itself is implemented in terms of an Alex scanner,
bootstrapping Alex is a bit tricky:
You need to have the build tools `alex` and `happy` installed already,
either via your system's package manager, the
Haskell Platform, or e.g. via (run this outside the Git repository!):
$ cabal install alex happy
which installs them into `${HOME}/.cabal/bin` by default (make sure
they're in your `$PATH` for the next steps!).
### Variant A
First you need to generate the pre-processed templates via
$ cabal new-run gen-alex-sdist
(otherwise `cabal install` will complain about
"`data/AlexTemplate: copyFile: does not exist (No such file or directory)`")
And then you can install `alex` simply by invoking
$ cabal install
from inside the Git folder.
### Variant B
Alternatively, you can use the `Makefile` which automates the steps of
producing a self-contained pre-bootstrapped source distribution with
pre-generated lexer/scanners (and which also performs the `cabal
new-run gen-alex-sdist` preprocessing step):
$ make sdist
$ cabal install dist/alex-*.tar.gz
For convenience, there's also a `make sdist-test` target which builds the
source tarball and runs the test-suite from within the source dist.
## Contributing & Reporting Issues
Please report any bugs or comments at https://github.com/simonmar/alex/issues
Share and enjoy,
Chris Dornan: cdornan@arm.com
Isaac Jones: ijones@syntaxpolice.org
Simon Marlow: simonmar@microsoft.com
alex-3.2.5/Setup.hs

import Distribution.Simple
main = defaultMain
alex-3.2.5/TODO

- Option for pure Haskell 98 output?
- maybe Haskell 2010 at this point?
- how about an option to use Data.Array.Unboxed?
- Put in {-# LINE #-} pragmas for token actions
- Prune states that aren't reachable?
- Issue a warning for tokens that can't be generated?
- Info file?
- start codes
- accepting states
- More compact lexer table encoding:
- equivalence classes?
- Improve performance of Alex itself
- AlexEOF doesn't provide a way to get at the text position of the EOF.
- Allow user-defined wrappers? Wrappers in files relative to the
current directory, for example?
- case-insensitivity option (like flex's -i).
alex-3.2.5/alex.cabal

cabal-version: >= 1.8
name: alex
version: 3.2.5
-- don't forget updating changelog.md!
license: BSD3
license-file: LICENSE
copyright: (c) Chris Dornan, Simon Marlow
author: Chris Dornan and Simon Marlow
maintainer: Simon Marlow
bug-reports: https://github.com/simonmar/alex/issues
stability: stable
homepage: http://www.haskell.org/alex/
synopsis: Alex is a tool for generating lexical analysers in Haskell
description:
Alex is a tool for generating lexical analysers in Haskell.
It takes a description of tokens based on regular
expressions and generates a Haskell module containing code
for scanning text efficiently. It is similar to the tool
lex or flex for C/C++.
category: Development
build-type: Simple
data-dir: data/
data-files:
AlexTemplate
AlexTemplate-ghc
AlexTemplate-ghc-nopred
AlexTemplate-ghc-debug
AlexTemplate-debug
AlexWrapper-basic
AlexWrapper-basic-bytestring
AlexWrapper-strict-bytestring
AlexWrapper-posn
AlexWrapper-posn-bytestring
AlexWrapper-monad
AlexWrapper-monad-bytestring
AlexWrapper-monadUserState
AlexWrapper-monadUserState-bytestring
AlexWrapper-gscan
extra-source-files:
CHANGELOG.md
README.md
TODO
alex.spec
doc/Makefile
doc/aclocal.m4
doc/alex.1.in
doc/alex.xml
doc/config.mk.in
doc/configure.ac
doc/docbook-xml.mk
doc/fptools.css
examples/Makefile
examples/Tokens.x
examples/Tokens_gscan.x
examples/Tokens_posn.x
examples/examples.x
examples/haskell.x
examples/lit.x
examples/pp.x
examples/state.x
examples/tiny.y
examples/words.x
examples/words_monad.x
examples/words_posn.x
src/Parser.y.boot
src/Scan.x.boot
src/ghc_hooks.c
templates/GenericTemplate.hs
templates/wrappers.hs
tests/Makefile
tests/simple.x
tests/null.x
tests/tokens.x
tests/tokens_gscan.x
tests/tokens_posn.x
tests/tokens_bytestring.x
tests/tokens_posn_bytestring.x
tests/tokens_scan_user.x
tests/tokens_strict_bytestring.x
tests/tokens_monad_bytestring.x
tests/tokens_monadUserState_bytestring.x
tests/tokens_bytestring_unicode.x
tests/basic_typeclass.x
tests/basic_typeclass_bytestring.x
tests/default_typeclass.x
tests/gscan_typeclass.x
tests/posn_typeclass.x
tests/monad_typeclass.x
tests/monad_typeclass_bytestring.x
tests/monadUserState_typeclass.x
tests/monadUserState_typeclass_bytestring.x
tests/posn_typeclass_bytestring.x
tests/strict_typeclass.x
tests/unicode.x
source-repository head
type: git
location: https://github.com/simonmar/alex.git
flag small_base
description: Choose the new smaller, split-up base package.
executable alex
hs-source-dirs: src
main-is: Main.hs
if flag(small_base)
build-depends: base >= 2.1, array, containers, directory
else
build-depends: base >= 1.0
build-depends: base < 5
extensions: CPP
ghc-options: -Wall -rtsopts
other-modules:
AbsSyn
CharSet
DFA
DFAMin
DFS
Info
Map
NFA
Output
Paths_alex
Parser
ParseMonad
Scan
Set
Sort
Util
UTF8
Data.Ranged
Data.Ranged.Boundaries
Data.Ranged.RangedSet
Data.Ranged.Ranges
test-suite tests
type: exitcode-stdio-1.0
main-is: test.hs
  -- This line is important, as it ensures that the local `exe:alex`
  -- component declared above is built before the test-suite component is
  -- invoked, as well as making sure that `alex` is made available on
  -- $PATH and `$alex_datadir` is set accordingly before invoking `test.hs`.
build-tools: alex
build-depends: base, process
alex-3.2.5/alex.spec

%define name alex
%define version 2.2
%define release 1
Name: %{name}
Version: %{version}
Release: %{release}
License: BSD-like
Group: Development/Languages/Haskell
URL: http://haskell.org/alex/
Source: http://haskell.org/alex/dist/%{version}/alex-%{version}.tar.gz
Packager: Sven Panne
BuildRoot: %{_tmppath}/%{name}-%{version}-build
Prefix: %{_prefix}
BuildRequires: happy, ghc, docbook-dtd, docbook-xsl-stylesheets, libxslt, libxml2, fop, xmltex, dvips
Summary: The lexer generator for Haskell
%description
Alex is a tool for generating lexical analysers in Haskell, given a
description of the tokens to be recognised in the form of regular
expressions. It is similar to the tool lex or flex for C/C++.
%prep
%setup -n alex-%{version}
%build
runhaskell Setup.lhs configure --prefix=%{_prefix} --docdir=%{_datadir}/doc/packages/%{name}
runhaskell Setup.lhs build
cd doc
test -f configure || autoreconf
./configure
make html
%install
runhaskell Setup.lhs copy --destdir=${RPM_BUILD_ROOT}
%clean
rm -rf ${RPM_BUILD_ROOT}
%files
%defattr(-,root,root)
%doc ANNOUNCE
%doc LICENSE
%doc README
%doc TODO
%doc doc/alex
%doc examples
%{prefix}/bin/alex
%{prefix}/share/alex-%{version}
alex-3.2.5/data/AlexTemplate

{-# LINE 1 "templates/GenericTemplate.hs" #-}
-- -----------------------------------------------------------------------------
-- ALEX TEMPLATE
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
-- -----------------------------------------------------------------------------
-- INTERNALS and main scanner engine
alexIndexInt16OffAddr arr off = arr ! off
alexIndexInt32OffAddr arr off = arr ! off
quickIndex arr i = arr ! i
-- -----------------------------------------------------------------------------
-- Main lexing routines
data AlexReturn a
= AlexEOF
| AlexError !AlexInput
| AlexSkip !AlexInput !Int
| AlexToken !AlexInput !Int a
-- alexScan :: AlexInput -> StartCode -> AlexReturn a
alexScan input__ (sc)
= alexScanUser undefined input__ (sc)
alexScanUser user__ input__ (sc)
= case alex_scan_tkn user__ input__ (0) input__ sc AlexNone of
(AlexNone, input__') ->
case alexGetByte input__ of
Nothing ->
AlexEOF
Just _ ->
AlexError input__'
(AlexLastSkip input__'' len, _) ->
AlexSkip input__'' len
(AlexLastAcc k input__''' len, _) ->
AlexToken input__''' len (alex_actions ! k)
-- Push the input through the DFA, remembering the most recent accepting
-- state it encountered.
alex_scan_tkn user__ orig_input len input__ s last_acc =
input__ `seq` -- strict in the input
let
new_acc = (check_accs (alex_accept `quickIndex` (s)))
in
new_acc `seq`
case alexGetByte input__ of
Nothing -> (new_acc, input__)
Just (c, new_input) ->
case fromIntegral c of { (ord_c) ->
let
base = alexIndexInt32OffAddr alex_base s
offset = (base + ord_c)
check = alexIndexInt16OffAddr alex_check offset
new_s = if (offset >= (0)) && (check == ord_c)
then alexIndexInt16OffAddr alex_table offset
else alexIndexInt16OffAddr alex_deflt s
in
case new_s of
(-1) -> (new_acc, input__)
-- on an error, we want to keep the input *before* the
-- character that failed, not after.
_ -> alex_scan_tkn user__ orig_input (if c < 0x80 || c >= 0xC0 then (len + (1)) else len)
-- note that the length is increased ONLY if this is the 1st byte in a char encoding
new_input new_s new_acc
}
where
check_accs (AlexAccNone) = last_acc
check_accs (AlexAcc a ) = AlexLastAcc a input__ (len)
check_accs (AlexAccSkip) = AlexLastSkip input__ (len)
check_accs (AlexAccPred a predx rest)
| predx user__ orig_input (len) input__
= AlexLastAcc a input__ (len)
| otherwise
= check_accs rest
check_accs (AlexAccSkipPred predx rest)
| predx user__ orig_input (len) input__
= AlexLastSkip input__ (len)
| otherwise
= check_accs rest
data AlexLastAcc
= AlexNone
| AlexLastAcc !Int !AlexInput !Int
| AlexLastSkip !AlexInput !Int
data AlexAcc user
= AlexAccNone
| AlexAcc Int
| AlexAccSkip
| AlexAccPred Int (AlexAccPred user) (AlexAcc user)
| AlexAccSkipPred (AlexAccPred user) (AlexAcc user)
type AlexAccPred user = user -> AlexInput -> Int -> AlexInput -> Bool
-- -----------------------------------------------------------------------------
-- Predicates on a rule
alexAndPred p1 p2 user__ in1 len in2
= p1 user__ in1 len in2 && p2 user__ in1 len in2
--alexPrevCharIsPred :: Char -> AlexAccPred _
alexPrevCharIs c _ input__ _ _ = c == alexInputPrevChar input__
alexPrevCharMatches f _ input__ _ _ = f (alexInputPrevChar input__)
--alexPrevCharIsOneOfPred :: Array Char Bool -> AlexAccPred _
alexPrevCharIsOneOf arr _ input__ _ _ = arr ! alexInputPrevChar input__
--alexRightContext :: Int -> AlexAccPred _
alexRightContext (sc) user__ _ _ input__ =
case alex_scan_tkn user__ input__ (0) input__ sc AlexNone of
(AlexNone, _) -> False
_ -> True
-- TODO: there's no need to find the longest
-- match when checking the right context, just
-- the first match will do.
alex-3.2.5/data/AlexTemplate-debug

{-# LINE 1 "templates/GenericTemplate.hs" #-}
-- -----------------------------------------------------------------------------
-- ALEX TEMPLATE
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
-- -----------------------------------------------------------------------------
-- INTERNALS and main scanner engine
alexIndexInt16OffAddr arr off = arr ! off
alexIndexInt32OffAddr arr off = arr ! off
quickIndex arr i = arr ! i
-- -----------------------------------------------------------------------------
-- Main lexing routines
data AlexReturn a
= AlexEOF
| AlexError !AlexInput
| AlexSkip !AlexInput !Int
| AlexToken !AlexInput !Int a
-- alexScan :: AlexInput -> StartCode -> AlexReturn a
alexScan input__ (sc)
= alexScanUser undefined input__ (sc)
alexScanUser user__ input__ (sc)
= case alex_scan_tkn user__ input__ (0) input__ sc AlexNone of
(AlexNone, input__') ->
case alexGetByte input__ of
Nothing ->
trace ("End of input.") $
AlexEOF
Just _ ->
trace ("Error.") $
AlexError input__'
(AlexLastSkip input__'' len, _) ->
trace ("Skipping.") $
AlexSkip input__'' len
(AlexLastAcc k input__''' len, _) ->
trace ("Accept.") $
AlexToken input__''' len (alex_actions ! k)
-- Push the input through the DFA, remembering the most recent accepting
-- state it encountered.
alex_scan_tkn user__ orig_input len input__ s last_acc =
input__ `seq` -- strict in the input
let
new_acc = (check_accs (alex_accept `quickIndex` (s)))
in
new_acc `seq`
case alexGetByte input__ of
Nothing -> (new_acc, input__)
Just (c, new_input) ->
trace ("State: " ++ show (s) ++ ", char: " ++ show c) $
case fromIntegral c of { (ord_c) ->
let
base = alexIndexInt32OffAddr alex_base s
offset = (base + ord_c)
check = alexIndexInt16OffAddr alex_check offset
new_s = if (offset >= (0)) && (check == ord_c)
then alexIndexInt16OffAddr alex_table offset
else alexIndexInt16OffAddr alex_deflt s
in
case new_s of
(-1) -> (new_acc, input__)
-- on an error, we want to keep the input *before* the
-- character that failed, not after.
_ -> alex_scan_tkn user__ orig_input (if c < 0x80 || c >= 0xC0 then (len + (1)) else len)
-- note that the length is increased ONLY if this is the 1st byte in a char encoding
new_input new_s new_acc
}
where
check_accs (AlexAccNone) = last_acc
check_accs (AlexAcc a ) = AlexLastAcc a input__ (len)
check_accs (AlexAccSkip) = AlexLastSkip input__ (len)
check_accs (AlexAccPred a predx rest)
| predx user__ orig_input (len) input__
= AlexLastAcc a input__ (len)
| otherwise
= check_accs rest
check_accs (AlexAccSkipPred predx rest)
| predx user__ orig_input (len) input__
= AlexLastSkip input__ (len)
| otherwise
= check_accs rest
data AlexLastAcc
= AlexNone
| AlexLastAcc !Int !AlexInput !Int
| AlexLastSkip !AlexInput !Int
data AlexAcc user
= AlexAccNone
| AlexAcc Int
| AlexAccSkip
| AlexAccPred Int (AlexAccPred user) (AlexAcc user)
| AlexAccSkipPred (AlexAccPred user) (AlexAcc user)
type AlexAccPred user = user -> AlexInput -> Int -> AlexInput -> Bool
-- -----------------------------------------------------------------------------
-- Predicates on a rule
alexAndPred p1 p2 user__ in1 len in2
= p1 user__ in1 len in2 && p2 user__ in1 len in2
--alexPrevCharIsPred :: Char -> AlexAccPred _
alexPrevCharIs c _ input__ _ _ = c == alexInputPrevChar input__
alexPrevCharMatches f _ input__ _ _ = f (alexInputPrevChar input__)
--alexPrevCharIsOneOfPred :: Array Char Bool -> AlexAccPred _
alexPrevCharIsOneOf arr _ input__ _ _ = arr ! alexInputPrevChar input__
--alexRightContext :: Int -> AlexAccPred _
alexRightContext (sc) user__ _ _ input__ =
case alex_scan_tkn user__ input__ (0) input__ sc AlexNone of
(AlexNone, _) -> False
_ -> True
-- TODO: there's no need to find the longest
-- match when checking the right context, just
-- the first match will do.
alex-3.2.5/data/AlexTemplate-ghc

{-# LINE 1 "templates/GenericTemplate.hs" #-}
-- -----------------------------------------------------------------------------
-- ALEX TEMPLATE
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
-- -----------------------------------------------------------------------------
-- INTERNALS and main scanner engine
-- Do not remove this comment. Required to fix CPP parsing when using GCC and a clang-compiled alex.
#if __GLASGOW_HASKELL__ > 706
#define GTE(n,m) (tagToEnum# (n >=# m))
#define EQ(n,m) (tagToEnum# (n ==# m))
#else
#define GTE(n,m) (n >=# m)
#define EQ(n,m) (n ==# m)
#endif
data AlexAddr = AlexA# Addr#
-- Do not remove this comment. Required to fix CPP parsing when using GCC and a clang-compiled alex.
#if __GLASGOW_HASKELL__ < 503
uncheckedShiftL# = shiftL#
#endif
{-# INLINE alexIndexInt16OffAddr #-}
alexIndexInt16OffAddr (AlexA# arr) off =
#ifdef WORDS_BIGENDIAN
narrow16Int# i
where
i = word2Int# ((high `uncheckedShiftL#` 8#) `or#` low)
high = int2Word# (ord# (indexCharOffAddr# arr (off' +# 1#)))
low = int2Word# (ord# (indexCharOffAddr# arr off'))
off' = off *# 2#
#else
indexInt16OffAddr# arr off
#endif
{-# INLINE alexIndexInt32OffAddr #-}
alexIndexInt32OffAddr (AlexA# arr) off =
#ifdef WORDS_BIGENDIAN
narrow32Int# i
where
i = word2Int# ((b3 `uncheckedShiftL#` 24#) `or#`
(b2 `uncheckedShiftL#` 16#) `or#`
(b1 `uncheckedShiftL#` 8#) `or#` b0)
b3 = int2Word# (ord# (indexCharOffAddr# arr (off' +# 3#)))
b2 = int2Word# (ord# (indexCharOffAddr# arr (off' +# 2#)))
b1 = int2Word# (ord# (indexCharOffAddr# arr (off' +# 1#)))
b0 = int2Word# (ord# (indexCharOffAddr# arr off'))
off' = off *# 4#
#else
indexInt32OffAddr# arr off
#endif
#if __GLASGOW_HASKELL__ < 503
quickIndex arr i = arr ! i
#else
-- GHC >= 503, unsafeAt is available from Data.Array.Base.
quickIndex = unsafeAt
#endif
-- -----------------------------------------------------------------------------
-- Main lexing routines
data AlexReturn a
= AlexEOF
| AlexError !AlexInput
| AlexSkip !AlexInput !Int
| AlexToken !AlexInput !Int a
-- alexScan :: AlexInput -> StartCode -> AlexReturn a
alexScan input__ (I# (sc))
= alexScanUser undefined input__ (I# (sc))
alexScanUser user__ input__ (I# (sc))
= case alex_scan_tkn user__ input__ 0# input__ sc AlexNone of
(AlexNone, input__') ->
case alexGetByte input__ of
Nothing ->
AlexEOF
Just _ ->
AlexError input__'
(AlexLastSkip input__'' len, _) ->
AlexSkip input__'' len
(AlexLastAcc k input__''' len, _) ->
AlexToken input__''' len (alex_actions ! k)
-- Push the input through the DFA, remembering the most recent accepting
-- state it encountered.
alex_scan_tkn user__ orig_input len input__ s last_acc =
input__ `seq` -- strict in the input
let
new_acc = (check_accs (alex_accept `quickIndex` (I# (s))))
in
new_acc `seq`
case alexGetByte input__ of
Nothing -> (new_acc, input__)
Just (c, new_input) ->
case fromIntegral c of { (I# (ord_c)) ->
let
base = alexIndexInt32OffAddr alex_base s
offset = (base +# ord_c)
check = alexIndexInt16OffAddr alex_check offset
new_s = if GTE(offset,0#) && EQ(check,ord_c)
then alexIndexInt16OffAddr alex_table offset
else alexIndexInt16OffAddr alex_deflt s
in
case new_s of
-1# -> (new_acc, input__)
-- on an error, we want to keep the input *before* the
-- character that failed, not after.
_ -> alex_scan_tkn user__ orig_input (if c < 0x80 || c >= 0xC0 then (len +# 1#) else len)
-- note that the length is increased ONLY if this is the 1st byte in a char encoding
new_input new_s new_acc
}
where
check_accs (AlexAccNone) = last_acc
check_accs (AlexAcc a ) = AlexLastAcc a input__ (I# (len))
check_accs (AlexAccSkip) = AlexLastSkip input__ (I# (len))
check_accs (AlexAccPred a predx rest)
| predx user__ orig_input (I# (len)) input__
= AlexLastAcc a input__ (I# (len))
| otherwise
= check_accs rest
check_accs (AlexAccSkipPred predx rest)
| predx user__ orig_input (I# (len)) input__
= AlexLastSkip input__ (I# (len))
| otherwise
= check_accs rest
data AlexLastAcc
= AlexNone
| AlexLastAcc !Int !AlexInput !Int
| AlexLastSkip !AlexInput !Int
data AlexAcc user
= AlexAccNone
| AlexAcc Int
| AlexAccSkip
| AlexAccPred Int (AlexAccPred user) (AlexAcc user)
| AlexAccSkipPred (AlexAccPred user) (AlexAcc user)
type AlexAccPred user = user -> AlexInput -> Int -> AlexInput -> Bool
-- -----------------------------------------------------------------------------
-- Predicates on a rule
alexAndPred p1 p2 user__ in1 len in2
= p1 user__ in1 len in2 && p2 user__ in1 len in2
--alexPrevCharIsPred :: Char -> AlexAccPred _
alexPrevCharIs c _ input__ _ _ = c == alexInputPrevChar input__
alexPrevCharMatches f _ input__ _ _ = f (alexInputPrevChar input__)
--alexPrevCharIsOneOfPred :: Array Char Bool -> AlexAccPred _
alexPrevCharIsOneOf arr _ input__ _ _ = arr ! alexInputPrevChar input__
--alexRightContext :: Int -> AlexAccPred _
alexRightContext (I# (sc)) user__ _ _ input__ =
case alex_scan_tkn user__ input__ 0# input__ sc AlexNone of
(AlexNone, _) -> False
_ -> True
-- TODO: there's no need to find the longest
-- match when checking the right context, just
-- the first match will do.
alex-3.2.5/data/AlexTemplate-ghc-debug

{-# LINE 1 "templates/GenericTemplate.hs" #-}
-- -----------------------------------------------------------------------------
-- ALEX TEMPLATE
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
-- -----------------------------------------------------------------------------
-- INTERNALS and main scanner engine
-- Do not remove this comment. Required to fix CPP parsing when using GCC and a clang-compiled alex.
#if __GLASGOW_HASKELL__ > 706
#define GTE(n,m) (tagToEnum# (n >=# m))
#define EQ(n,m) (tagToEnum# (n ==# m))
#else
#define GTE(n,m) (n >=# m)
#define EQ(n,m) (n ==# m)
#endif
data AlexAddr = AlexA# Addr#
-- Do not remove this comment. Required to fix CPP parsing when using GCC and a clang-compiled alex.
#if __GLASGOW_HASKELL__ < 503
uncheckedShiftL# = shiftL#
#endif
{-# INLINE alexIndexInt16OffAddr #-}
alexIndexInt16OffAddr (AlexA# arr) off =
#ifdef WORDS_BIGENDIAN
narrow16Int# i
where
i = word2Int# ((high `uncheckedShiftL#` 8#) `or#` low)
high = int2Word# (ord# (indexCharOffAddr# arr (off' +# 1#)))
low = int2Word# (ord# (indexCharOffAddr# arr off'))
off' = off *# 2#
#else
indexInt16OffAddr# arr off
#endif
{-# INLINE alexIndexInt32OffAddr #-}
alexIndexInt32OffAddr (AlexA# arr) off =
#ifdef WORDS_BIGENDIAN
narrow32Int# i
where
i = word2Int# ((b3 `uncheckedShiftL#` 24#) `or#`
(b2 `uncheckedShiftL#` 16#) `or#`
(b1 `uncheckedShiftL#` 8#) `or#` b0)
b3 = int2Word# (ord# (indexCharOffAddr# arr (off' +# 3#)))
b2 = int2Word# (ord# (indexCharOffAddr# arr (off' +# 2#)))
b1 = int2Word# (ord# (indexCharOffAddr# arr (off' +# 1#)))
b0 = int2Word# (ord# (indexCharOffAddr# arr off'))
off' = off *# 4#
#else
indexInt32OffAddr# arr off
#endif
#if __GLASGOW_HASKELL__ < 503
quickIndex arr i = arr ! i
#else
-- GHC >= 503, unsafeAt is available from Data.Array.Base.
quickIndex = unsafeAt
#endif
-- -----------------------------------------------------------------------------
-- Main lexing routines
data AlexReturn a
= AlexEOF
| AlexError !AlexInput
| AlexSkip !AlexInput !Int
| AlexToken !AlexInput !Int a
-- alexScan :: AlexInput -> StartCode -> AlexReturn a
alexScan input__ (I# (sc))
= alexScanUser undefined input__ (I# (sc))
alexScanUser user__ input__ (I# (sc))
= case alex_scan_tkn user__ input__ 0# input__ sc AlexNone of
(AlexNone, input__') ->
case alexGetByte input__ of
Nothing ->
trace ("End of input.") $
AlexEOF
Just _ ->
trace ("Error.") $
AlexError input__'
(AlexLastSkip input__'' len, _) ->
trace ("Skipping.") $
AlexSkip input__'' len
(AlexLastAcc k input__''' len, _) ->
trace ("Accept.") $
AlexToken input__''' len (alex_actions ! k)
-- Push the input through the DFA, remembering the most recent accepting
-- state it encountered.
alex_scan_tkn user__ orig_input len input__ s last_acc =
input__ `seq` -- strict in the input
let
new_acc = (check_accs (alex_accept `quickIndex` (I# (s))))
in
new_acc `seq`
case alexGetByte input__ of
Nothing -> (new_acc, input__)
Just (c, new_input) ->
trace ("State: " ++ show (I# (s)) ++ ", char: " ++ show c) $
case fromIntegral c of { (I# (ord_c)) ->
let
base = alexIndexInt32OffAddr alex_base s
offset = (base +# ord_c)
check = alexIndexInt16OffAddr alex_check offset
new_s = if GTE(offset,0#) && EQ(check,ord_c)
then alexIndexInt16OffAddr alex_table offset
else alexIndexInt16OffAddr alex_deflt s
in
case new_s of
-1# -> (new_acc, input__)
-- on an error, we want to keep the input *before* the
-- character that failed, not after.
_ -> alex_scan_tkn user__ orig_input (if c < 0x80 || c >= 0xC0 then (len +# 1#) else len)
-- note that the length is increased ONLY if this is the 1st byte in a char encoding
new_input new_s new_acc
}
where
check_accs (AlexAccNone) = last_acc
check_accs (AlexAcc a ) = AlexLastAcc a input__ (I# (len))
check_accs (AlexAccSkip) = AlexLastSkip input__ (I# (len))
check_accs (AlexAccPred a predx rest)
| predx user__ orig_input (I# (len)) input__
= AlexLastAcc a input__ (I# (len))
| otherwise
= check_accs rest
check_accs (AlexAccSkipPred predx rest)
| predx user__ orig_input (I# (len)) input__
= AlexLastSkip input__ (I# (len))
| otherwise
= check_accs rest
data AlexLastAcc
= AlexNone
| AlexLastAcc !Int !AlexInput !Int
| AlexLastSkip !AlexInput !Int
data AlexAcc user
= AlexAccNone
| AlexAcc Int
| AlexAccSkip
| AlexAccPred Int (AlexAccPred user) (AlexAcc user)
| AlexAccSkipPred (AlexAccPred user) (AlexAcc user)
type AlexAccPred user = user -> AlexInput -> Int -> AlexInput -> Bool
-- -----------------------------------------------------------------------------
-- Predicates on a rule
alexAndPred p1 p2 user__ in1 len in2
= p1 user__ in1 len in2 && p2 user__ in1 len in2
--alexPrevCharIsPred :: Char -> AlexAccPred _
alexPrevCharIs c _ input__ _ _ = c == alexInputPrevChar input__
alexPrevCharMatches f _ input__ _ _ = f (alexInputPrevChar input__)
--alexPrevCharIsOneOfPred :: Array Char Bool -> AlexAccPred _
alexPrevCharIsOneOf arr _ input__ _ _ = arr ! alexInputPrevChar input__
--alexRightContext :: Int -> AlexAccPred _
alexRightContext (I# (sc)) user__ _ _ input__ =
case alex_scan_tkn user__ input__ 0# input__ sc AlexNone of
(AlexNone, _) -> False
_ -> True
-- TODO: there's no need to find the longest
-- match when checking the right context, just
-- the first match will do.
alex-3.2.5/data/AlexTemplate-ghc-nopred

{-# LINE 1 "templates/GenericTemplate.hs" #-}
-- -----------------------------------------------------------------------------
-- ALEX TEMPLATE
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
-- -----------------------------------------------------------------------------
-- INTERNALS and main scanner engine
-- Do not remove this comment. Required to fix CPP parsing when using GCC and a clang-compiled alex.
#if __GLASGOW_HASKELL__ > 706
#define GTE(n,m) (tagToEnum# (n >=# m))
#define EQ(n,m) (tagToEnum# (n ==# m))
#else
#define GTE(n,m) (n >=# m)
#define EQ(n,m) (n ==# m)
#endif
data AlexAddr = AlexA# Addr#
-- Do not remove this comment. Required to fix CPP parsing when using GCC and a clang-compiled alex.
#if __GLASGOW_HASKELL__ < 503
uncheckedShiftL# = shiftL#
#endif
{-# INLINE alexIndexInt16OffAddr #-}
alexIndexInt16OffAddr (AlexA# arr) off =
#ifdef WORDS_BIGENDIAN
narrow16Int# i
where
i = word2Int# ((high `uncheckedShiftL#` 8#) `or#` low)
high = int2Word# (ord# (indexCharOffAddr# arr (off' +# 1#)))
low = int2Word# (ord# (indexCharOffAddr# arr off'))
off' = off *# 2#
#else
indexInt16OffAddr# arr off
#endif
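The big-endian branch above reassembles a little-endian 16-bit table entry from its two bytes using unboxed primitives. The same assembly can be sketched with ordinary boxed types (the name `int16FromLE` is illustrative, not part of the template):

```haskell
import Data.Bits (shiftL, (.|.))
import Data.Int (Int16)
import Data.Word (Word16, Word8)

-- Boxed sketch of the assembly done above with unboxed primitives: the
-- tables are stored little-endian, so a big-endian host recombines the
-- two bytes by hand (low byte first, high byte shifted left), then
-- narrows to a signed 16-bit value.
int16FromLE :: Word8 -> Word8 -> Int16
int16FromLE lo hi =
  fromIntegral ((fromIntegral hi `shiftL` 8 .|. fromIntegral lo) :: Word16)
```

For example, `int16FromLE 0x34 0x12` gives `0x1234`, and `int16FromLE 0xFF 0xFF` gives `-1` via two's-complement narrowing, matching what `narrow16Int#` does in the template.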
{-# INLINE alexIndexInt32OffAddr #-}
alexIndexInt32OffAddr (AlexA# arr) off =
#ifdef WORDS_BIGENDIAN
narrow32Int# i
where
i = word2Int# ((b3 `uncheckedShiftL#` 24#) `or#`
(b2 `uncheckedShiftL#` 16#) `or#`
(b1 `uncheckedShiftL#` 8#) `or#` b0)
b3 = int2Word# (ord# (indexCharOffAddr# arr (off' +# 3#)))
b2 = int2Word# (ord# (indexCharOffAddr# arr (off' +# 2#)))
b1 = int2Word# (ord# (indexCharOffAddr# arr (off' +# 1#)))
b0 = int2Word# (ord# (indexCharOffAddr# arr off'))
off' = off *# 4#
#else
indexInt32OffAddr# arr off
#endif
#if __GLASGOW_HASKELL__ < 503
quickIndex arr i = arr ! i
#else
-- GHC >= 503, unsafeAt is available from Data.Array.Base.
quickIndex = unsafeAt
#endif
-- -----------------------------------------------------------------------------
-- Main lexing routines
data AlexReturn a
= AlexEOF
| AlexError !AlexInput
| AlexSkip !AlexInput !Int
| AlexToken !AlexInput !Int a
-- alexScan :: AlexInput -> StartCode -> AlexReturn a
alexScan input__ (I# (sc))
= alexScanUser undefined input__ (I# (sc))
alexScanUser user__ input__ (I# (sc))
= case alex_scan_tkn user__ input__ 0# input__ sc AlexNone of
(AlexNone, input__') ->
case alexGetByte input__ of
Nothing ->
AlexEOF
Just _ ->
AlexError input__'
(AlexLastSkip input__'' len, _) ->
AlexSkip input__'' len
(AlexLastAcc k input__''' len, _) ->
AlexToken input__''' len (alex_actions ! k)
-- Push the input through the DFA, remembering the most recent accepting
-- state it encountered.
alex_scan_tkn user__ orig_input len input__ s last_acc =
input__ `seq` -- strict in the input
let
new_acc = (check_accs (alex_accept `quickIndex` (I# (s))))
in
new_acc `seq`
case alexGetByte input__ of
Nothing -> (new_acc, input__)
Just (c, new_input) ->
case fromIntegral c of { (I# (ord_c)) ->
let
base = alexIndexInt32OffAddr alex_base s
offset = (base +# ord_c)
check = alexIndexInt16OffAddr alex_check offset
new_s = if GTE(offset,0#) && EQ(check,ord_c)
then alexIndexInt16OffAddr alex_table offset
else alexIndexInt16OffAddr alex_deflt s
in
case new_s of
-1# -> (new_acc, input__)
-- on an error, we want to keep the input *before* the
-- character that failed, not after.
_ -> alex_scan_tkn user__ orig_input (if c < 0x80 || c >= 0xC0 then (len +# 1#) else len)
-- note that the length is increased ONLY if this is the first byte in a character encoding
new_input new_s new_acc
}
where
check_accs (AlexAccNone) = last_acc
check_accs (AlexAcc a ) = AlexLastAcc a input__ (I# (len))
check_accs (AlexAccSkip) = AlexLastSkip input__ (I# (len))
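The maximal-munch loop above can be sketched with boxed types over a toy DFA: advance while a transition exists, remember the most recent accepting position, and back up to it when the DFA gets stuck (all names here are illustrative, not part of the template):

```haskell
-- Toy maximal-munch scan: like alex_scan_tkn, push the input through the
-- DFA, remembering the most recent accepting position, and back up to it
-- when no transition applies.
scanLongest :: (Int -> Char -> Maybe Int)  -- transition function
            -> (Int -> Bool)               -- accepting states
            -> Int                         -- start state
            -> String
            -> Maybe (String, String)      -- (longest token, remaining input)
scanLongest step accept s0 input = go s0 0 input Nothing
  where
    go s n rest best =
      let best' = if accept s then Just n else best
      in case rest of
           (c:cs) | Just s' <- step s c -> go s' (n + 1) cs best'
           _                            -> fmap (`splitAt` input) best'
```

With a two-state DFA accepting digit runs (`step` sends any digit to state 1, `accept = (== 1)`), `scanLongest step (== 1) 0 "123ab"` yields `Just ("123", "ab")`.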
data AlexLastAcc
= AlexNone
| AlexLastAcc !Int !AlexInput !Int
| AlexLastSkip !AlexInput !Int
data AlexAcc user
= AlexAccNone
| AlexAcc Int
| AlexAccSkip
-- File: alex-3.2.5/data/AlexWrapper-basic
{-# LINE 1 "templates/wrappers.hs" #-}
-- -----------------------------------------------------------------------------
-- Alex wrapper code.
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
import Data.Word (Word8)
import Data.Char (ord)
import qualified Data.Bits
-- | Encode a Haskell String to a list of Word8 values, in UTF8 format.
utf8Encode :: Char -> [Word8]
utf8Encode = uncurry (:) . utf8Encode'
utf8Encode' :: Char -> (Word8, [Word8])
utf8Encode' c = case go (ord c) of
(x, xs) -> (fromIntegral x, map fromIntegral xs)
where
go oc
| oc <= 0x7f = ( oc
, [
])
| oc <= 0x7ff = ( 0xc0 + (oc `Data.Bits.shiftR` 6)
, [0x80 + oc Data.Bits..&. 0x3f
])
| oc <= 0xffff = ( 0xe0 + (oc `Data.Bits.shiftR` 12)
, [0x80 + ((oc `Data.Bits.shiftR` 6) Data.Bits..&. 0x3f)
, 0x80 + oc Data.Bits..&. 0x3f
])
| otherwise = ( 0xf0 + (oc `Data.Bits.shiftR` 18)
, [0x80 + ((oc `Data.Bits.shiftR` 12) Data.Bits..&. 0x3f)
, 0x80 + ((oc `Data.Bits.shiftR` 6) Data.Bits..&. 0x3f)
, 0x80 + oc Data.Bits..&. 0x3f
])
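To see the encoder's byte layout concretely, here are worked examples covering all four UTF-8 widths, with the encoder restated compactly so the sketch stands alone (the name `encodeUtf8` is illustrative):

```haskell
import Data.Bits (shiftR, (.&.))
import Data.Char (ord)
import Data.Word (Word8)

-- The encoder above, restated compactly so this sketch is self-contained:
-- a leading byte tagged by width, then 6-bit continuation bytes.
encodeUtf8 :: Char -> [Word8]
encodeUtf8 c = map fromIntegral bytes
  where
    oc = ord c
    bytes
      | oc <= 0x7f   = [oc]
      | oc <= 0x7ff  = [0xc0 + shiftR oc 6, cont 0]
      | oc <= 0xffff = [0xe0 + shiftR oc 12, cont 6, cont 0]
      | otherwise    = [0xf0 + shiftR oc 18, cont 12, cont 6, cont 0]
    cont k = 0x80 + (shiftR oc k .&. 0x3f)

-- encodeUtf8 'A'        == [0x41]                    -- U+0041, 1 byte
-- encodeUtf8 '\xE9'     == [0xC3, 0xA9]              -- U+00E9, 2 bytes
-- encodeUtf8 '\x20AC'   == [0xE2, 0x82, 0xAC]        -- U+20AC, 3 bytes
-- encodeUtf8 '\x1D11E'  == [0xF0, 0x9D, 0x84, 0x9E]  -- U+1D11E, 4 bytes
```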
type Byte = Word8
-- -----------------------------------------------------------------------------
-- The input type
-- -----------------------------------------------------------------------------
-- Token positions
-- `Posn' records the location of a token in the input text. It has three
-- fields: the address (number of characters preceding the token), line number
-- and column of a token within the file. `start_pos' gives the position of the
-- start of the file and `eof_pos' a standard encoding for the end of file.
-- `move_pos' calculates the new position after traversing a given character,
-- assuming the usual eight character tab stops.
-- -----------------------------------------------------------------------------
-- Monad (default and with ByteString input)
-- -----------------------------------------------------------------------------
-- Basic wrapper
type AlexInput = (Char,[Byte],String)
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar (c,_,_) = c
-- alexScanTokens :: String -> [token]
alexScanTokens str = go ('\n',[],str)
where go inp__@(_,_bs,s) =
case alexScan inp__ 0 of
AlexEOF -> []
AlexError _ -> error "lexical error"
AlexSkip inp__' _ln -> go inp__'
AlexToken inp__' len act -> act (take len s) : go inp__'
alexGetByte :: AlexInput -> Maybe (Byte,AlexInput)
alexGetByte (c,(b:bs),s) = Just (b,(c,bs,s))
alexGetByte (_,[],[]) = Nothing
alexGetByte (_,[],(c:s)) = case utf8Encode' c of
(b, bs) -> Just (b, (c, bs, s))
-- -----------------------------------------------------------------------------
-- Basic wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- Posn wrapper
-- Adds text positions to the basic model.
-- -----------------------------------------------------------------------------
-- Posn wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- GScan wrapper
-- For compatibility with previous versions of Alex, and because we can.
-- File: alex-3.2.5/data/AlexWrapper-basic-bytestring
{-# LINE 1 "templates/wrappers.hs" #-}
-- -----------------------------------------------------------------------------
-- Alex wrapper code.
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
import Data.Word (Word8)
import Data.Int (Int64)
import qualified Data.Char
import qualified Data.ByteString.Lazy as ByteString
import qualified Data.ByteString.Internal as ByteString (w2c)
type Byte = Word8
-- -----------------------------------------------------------------------------
-- The input type
data AlexInput = AlexInput { alexChar :: {-# UNPACK #-} !Char, -- previous char
alexStr :: !ByteString.ByteString, -- current input string
alexBytePos :: {-# UNPACK #-} !Int64} -- bytes consumed so far
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar = alexChar
alexGetByte (AlexInput {alexStr=cs,alexBytePos=n}) =
case ByteString.uncons cs of
Nothing -> Nothing
Just (c, rest) ->
Just (c, AlexInput {
alexChar = ByteString.w2c c,
alexStr = rest,
alexBytePos = n+1})
-- -----------------------------------------------------------------------------
-- Token positions
-- `Posn' records the location of a token in the input text. It has three
-- fields: the address (number of characters preceding the token), line number
-- and column of a token within the file. `start_pos' gives the position of the
-- start of the file and `eof_pos' a standard encoding for the end of file.
-- `move_pos' calculates the new position after traversing a given character,
-- assuming the usual eight character tab stops.
-- -----------------------------------------------------------------------------
-- Monad (default and with ByteString input)
-- -----------------------------------------------------------------------------
-- Basic wrapper
-- -----------------------------------------------------------------------------
-- Basic wrapper, ByteString version
-- alexScanTokens :: ByteString.ByteString -> [token]
alexScanTokens str = go (AlexInput '\n' str 0)
where go inp__ =
case alexScan inp__ 0 of
AlexEOF -> []
AlexError _ -> error "lexical error"
AlexSkip inp__' _len -> go inp__'
AlexToken inp__' _ act ->
let len = alexBytePos inp__' - alexBytePos inp__ in
act (ByteString.take len (alexStr inp__)) : go inp__'
-- -----------------------------------------------------------------------------
-- Posn wrapper
-- Adds text positions to the basic model.
-- -----------------------------------------------------------------------------
-- Posn wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- GScan wrapper
-- For compatibility with previous versions of Alex, and because we can.
-- File: alex-3.2.5/data/AlexWrapper-gscan
{-# LINE 1 "templates/wrappers.hs" #-}
-- -----------------------------------------------------------------------------
-- Alex wrapper code.
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
import Data.Word (Word8)
import Data.Char (ord)
import qualified Data.Bits
-- | Encode a Haskell String to a list of Word8 values, in UTF8 format.
utf8Encode :: Char -> [Word8]
utf8Encode = uncurry (:) . utf8Encode'
utf8Encode' :: Char -> (Word8, [Word8])
utf8Encode' c = case go (ord c) of
(x, xs) -> (fromIntegral x, map fromIntegral xs)
where
go oc
| oc <= 0x7f = ( oc
, [
])
| oc <= 0x7ff = ( 0xc0 + (oc `Data.Bits.shiftR` 6)
, [0x80 + oc Data.Bits..&. 0x3f
])
| oc <= 0xffff = ( 0xe0 + (oc `Data.Bits.shiftR` 12)
, [0x80 + ((oc `Data.Bits.shiftR` 6) Data.Bits..&. 0x3f)
, 0x80 + oc Data.Bits..&. 0x3f
])
| otherwise = ( 0xf0 + (oc `Data.Bits.shiftR` 18)
, [0x80 + ((oc `Data.Bits.shiftR` 12) Data.Bits..&. 0x3f)
, 0x80 + ((oc `Data.Bits.shiftR` 6) Data.Bits..&. 0x3f)
, 0x80 + oc Data.Bits..&. 0x3f
])
type Byte = Word8
-- -----------------------------------------------------------------------------
-- The input type
type AlexInput = (AlexPosn, -- current position,
Char, -- previous char
[Byte], -- pending bytes on current char
String) -- current input string
ignorePendingBytes :: AlexInput -> AlexInput
ignorePendingBytes (p,c,_ps,s) = (p,c,[],s)
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar (_p,c,_bs,_s) = c
alexGetByte :: AlexInput -> Maybe (Byte,AlexInput)
alexGetByte (p,c,(b:bs),s) = Just (b,(p,c,bs,s))
alexGetByte (_,_,[],[]) = Nothing
alexGetByte (p,_,[],(c:s)) = let p' = alexMove p c
in case utf8Encode' c of
(b, bs) -> p' `seq` Just (b, (p', c, bs, s))
-- -----------------------------------------------------------------------------
-- Token positions
-- `Posn' records the location of a token in the input text. It has three
-- fields: the address (number of characters preceding the token), line number
-- and column of a token within the file. `start_pos' gives the position of the
-- start of the file and `eof_pos' a standard encoding for the end of file.
-- `move_pos' calculates the new position after traversing a given character,
-- assuming the usual eight character tab stops.
data AlexPosn = AlexPn !Int !Int !Int
deriving (Eq,Show)
alexStartPos :: AlexPosn
alexStartPos = AlexPn 0 1 1
alexMove :: AlexPosn -> Char -> AlexPosn
alexMove (AlexPn a l c) '\t' = AlexPn (a+1) l (c+alex_tab_size-((c-1) `mod` alex_tab_size))
alexMove (AlexPn a l _) '\n' = AlexPn (a+1) (l+1) 1
alexMove (AlexPn a l c) _ = AlexPn (a+1) l (c+1)
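The position arithmetic above can be exercised standalone; this sketch fixes the tab size at 8 (in generated code, `alex_tab_size` comes from the scanner specification) and uses illustrative names:

```haskell
-- Standalone sketch of alexMove with the tab size fixed at 8:
-- address, line, column (line and column are 1-based, tabs snap the
-- column to the next multiple-of-8-plus-1 stop).
data Posn = Pn !Int !Int !Int deriving (Eq, Show)

move :: Posn -> Char -> Posn
move (Pn a l c) '\t' = Pn (a + 1) l (c + 8 - ((c - 1) `mod` 8))
move (Pn a l _) '\n' = Pn (a + 1) (l + 1) 1
move (Pn a l c) _    = Pn (a + 1) l (c + 1)
```

For example, `foldl move (Pn 0 1 1) "a\tb\n"` steps through columns 2, 9, and 10, then starts line 2 at column 1, ending at `Pn 4 2 1`.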
-- -----------------------------------------------------------------------------
-- Monad (default and with ByteString input)
-- -----------------------------------------------------------------------------
-- Basic wrapper
-- -----------------------------------------------------------------------------
-- Basic wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- Posn wrapper
-- Adds text positions to the basic model.
-- -----------------------------------------------------------------------------
-- Posn wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- GScan wrapper
-- For compatibility with previous versions of Alex, and because we can.
alexGScan stop__ state__ inp__ =
alex_gscan stop__ alexStartPos '\n' [] inp__ (0,state__)
alex_gscan stop__ p c bs inp__ (sc,state__) =
case alexScan (p,c,bs,inp__) sc of
AlexEOF -> stop__ p c inp__ (sc,state__)
AlexError _ -> stop__ p c inp__ (sc,state__)
AlexSkip (p',c',bs',inp__') _len ->
alex_gscan stop__ p' c' bs' inp__' (sc,state__)
AlexToken (p',c',bs',inp__') len k ->
k p c inp__ len (\scs -> alex_gscan stop__ p' c' bs' inp__' scs) (sc,state__)
-- File: alex-3.2.5/data/AlexWrapper-monad
{-# LINE 1 "templates/wrappers.hs" #-}
-- -----------------------------------------------------------------------------
-- Alex wrapper code.
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
import Control.Applicative as App (Applicative (..))
import Data.Word (Word8)
import Data.Char (ord)
import qualified Data.Bits
-- | Encode a Haskell String to a list of Word8 values, in UTF8 format.
utf8Encode :: Char -> [Word8]
utf8Encode = uncurry (:) . utf8Encode'
utf8Encode' :: Char -> (Word8, [Word8])
utf8Encode' c = case go (ord c) of
(x, xs) -> (fromIntegral x, map fromIntegral xs)
where
go oc
| oc <= 0x7f = ( oc
, [
])
| oc <= 0x7ff = ( 0xc0 + (oc `Data.Bits.shiftR` 6)
, [0x80 + oc Data.Bits..&. 0x3f
])
| oc <= 0xffff = ( 0xe0 + (oc `Data.Bits.shiftR` 12)
, [0x80 + ((oc `Data.Bits.shiftR` 6) Data.Bits..&. 0x3f)
, 0x80 + oc Data.Bits..&. 0x3f
])
| otherwise = ( 0xf0 + (oc `Data.Bits.shiftR` 18)
, [0x80 + ((oc `Data.Bits.shiftR` 12) Data.Bits..&. 0x3f)
, 0x80 + ((oc `Data.Bits.shiftR` 6) Data.Bits..&. 0x3f)
, 0x80 + oc Data.Bits..&. 0x3f
])
type Byte = Word8
-- -----------------------------------------------------------------------------
-- The input type
type AlexInput = (AlexPosn, -- current position,
Char, -- previous char
[Byte], -- pending bytes on current char
String) -- current input string
ignorePendingBytes :: AlexInput -> AlexInput
ignorePendingBytes (p,c,_ps,s) = (p,c,[],s)
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar (_p,c,_bs,_s) = c
alexGetByte :: AlexInput -> Maybe (Byte,AlexInput)
alexGetByte (p,c,(b:bs),s) = Just (b,(p,c,bs,s))
alexGetByte (_,_,[],[]) = Nothing
alexGetByte (p,_,[],(c:s)) = let p' = alexMove p c
in case utf8Encode' c of
(b, bs) -> p' `seq` Just (b, (p', c, bs, s))
-- -----------------------------------------------------------------------------
-- Token positions
-- `Posn' records the location of a token in the input text. It has three
-- fields: the address (number of characters preceding the token), line number
-- and column of a token within the file. `start_pos' gives the position of the
-- start of the file and `eof_pos' a standard encoding for the end of file.
-- `move_pos' calculates the new position after traversing a given character,
-- assuming the usual eight character tab stops.
data AlexPosn = AlexPn !Int !Int !Int
deriving (Eq,Show)
alexStartPos :: AlexPosn
alexStartPos = AlexPn 0 1 1
alexMove :: AlexPosn -> Char -> AlexPosn
alexMove (AlexPn a l c) '\t' = AlexPn (a+1) l (c+alex_tab_size-((c-1) `mod` alex_tab_size))
alexMove (AlexPn a l _) '\n' = AlexPn (a+1) (l+1) 1
alexMove (AlexPn a l c) _ = AlexPn (a+1) l (c+1)
-- -----------------------------------------------------------------------------
-- Monad (default and with ByteString input)
data AlexState = AlexState {
alex_pos :: !AlexPosn, -- position at current input location
alex_inp :: String, -- the current input
alex_chr :: !Char, -- the character before the input
alex_bytes :: [Byte],
alex_scd :: !Int -- the current startcode
}
-- Compile with -funbox-strict-fields for best results!
runAlex :: String -> Alex a -> Either String a
runAlex input__ (Alex f)
= case f (AlexState {alex_bytes = [],
alex_pos = alexStartPos,
alex_inp = input__,
alex_chr = '\n',
alex_scd = 0}) of Left msg -> Left msg
Right ( _, a ) -> Right a
newtype Alex a = Alex { unAlex :: AlexState -> Either String (AlexState, a) }
instance Functor Alex where
fmap f a = Alex $ \s -> case unAlex a s of
Left msg -> Left msg
Right (s', a') -> Right (s', f a')
instance Applicative Alex where
pure a = Alex $ \s -> Right (s, a)
fa <*> a = Alex $ \s -> case unAlex fa s of
Left msg -> Left msg
Right (s', f) -> case unAlex a s' of
Left msg -> Left msg
Right (s'', b) -> Right (s'', f b)
instance Monad Alex where
m >>= k = Alex $ \s -> case unAlex m s of
Left msg -> Left msg
Right (s',a) -> unAlex (k a) s'
return = App.pure
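The `Alex` newtype above is the classic combined state-and-error monad: each step threads the state through and short-circuits on the first `Left`. A self-contained toy version over an `Int` state (all names here are illustrative, not part of the template):

```haskell
-- Toy state-plus-error monad, structurally the same as Alex:
-- a function from state to Either error (new state, result).
newtype St a = St { runSt :: Int -> Either String (Int, a) }

instance Functor St where
  fmap f (St g) = St $ \s -> fmap (fmap f) (g s)

instance Applicative St where
  pure a = St $ \s -> Right (s, a)
  St ff <*> St fa = St $ \s -> do
    (s', f)  <- ff s
    (s'', a) <- fa s'
    Right (s'', f a)

instance Monad St where
  St m >>= k = St $ \s -> do
    (s', a) <- m s
    runSt (k a) s'

bump :: St Int              -- return the current state, then increment it
bump = St $ \s -> Right (s + 1, s)

failWith :: String -> St a  -- abort the whole computation, like alexError
failWith msg = St $ \_ -> Left msg
```

For example, `runSt (bump >> bump) 0` yields `Right (2, 1)`, while a `failWith` anywhere in the chain yields that `Left` and discards the state, just as `alexError` aborts a `runAlex`.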
alexGetInput :: Alex AlexInput
alexGetInput
= Alex $ \s@AlexState{alex_pos=pos,alex_chr=c,alex_bytes=bs,alex_inp=inp__} ->
Right (s, (pos,c,bs,inp__))
alexSetInput :: AlexInput -> Alex ()
alexSetInput (pos,c,bs,inp__)
= Alex $ \s -> case s{alex_pos=pos,alex_chr=c,alex_bytes=bs,alex_inp=inp__} of
state__@(AlexState{}) -> Right (state__, ())
alexError :: String -> Alex a
alexError message = Alex $ const $ Left message
alexGetStartCode :: Alex Int
alexGetStartCode = Alex $ \s@AlexState{alex_scd=sc} -> Right (s, sc)
alexSetStartCode :: Int -> Alex ()
alexSetStartCode sc = Alex $ \s -> Right (s{alex_scd=sc}, ())
alexMonadScan = do
inp__ <- alexGetInput
sc <- alexGetStartCode
case alexScan inp__ sc of
AlexEOF -> alexEOF
AlexError ((AlexPn _ line column),_,_,_) -> alexError $ "lexical error at line " ++ (show line) ++ ", column " ++ (show column)
AlexSkip inp__' _len -> do
alexSetInput inp__'
alexMonadScan
AlexToken inp__' len action -> do
alexSetInput inp__'
action (ignorePendingBytes inp__) len
-- -----------------------------------------------------------------------------
-- Useful token actions
type AlexAction result = AlexInput -> Int -> Alex result
-- just ignore this token and scan another one
-- skip :: AlexAction result
skip _input _len = alexMonadScan
-- ignore this token, but set the start code to a new value
-- begin :: Int -> AlexAction result
begin code _input _len = do alexSetStartCode code; alexMonadScan
-- perform an action for this token, and set the start code to a new value
andBegin :: AlexAction result -> Int -> AlexAction result
(action `andBegin` code) input__ len = do
alexSetStartCode code
action input__ len
token :: (AlexInput -> Int -> token) -> AlexAction token
token t input__ len = return (t input__ len)
-- -----------------------------------------------------------------------------
-- Basic wrapper
-- -----------------------------------------------------------------------------
-- Basic wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- Posn wrapper
-- Adds text positions to the basic model.
-- -----------------------------------------------------------------------------
-- Posn wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- GScan wrapper
-- For compatibility with previous versions of Alex, and because we can.
-- File: alex-3.2.5/data/AlexWrapper-monad-bytestring
{-# LINE 1 "templates/wrappers.hs" #-}
-- -----------------------------------------------------------------------------
-- Alex wrapper code.
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
import Control.Applicative as App (Applicative (..))
import Data.Word (Word8)
import Data.Int (Int64)
import qualified Data.Char
import qualified Data.ByteString.Lazy as ByteString
import qualified Data.ByteString.Internal as ByteString (w2c)
type Byte = Word8
-- -----------------------------------------------------------------------------
-- The input type
type AlexInput = (AlexPosn, -- current position,
Char, -- previous char
ByteString.ByteString, -- current input string
Int64) -- bytes consumed so far
ignorePendingBytes :: AlexInput -> AlexInput
ignorePendingBytes i = i -- no pending bytes when lexing bytestrings
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar (_,c,_,_) = c
alexGetByte :: AlexInput -> Maybe (Byte,AlexInput)
alexGetByte (p,_,cs,n) =
case ByteString.uncons cs of
Nothing -> Nothing
Just (b, cs') ->
let c = ByteString.w2c b
p' = alexMove p c
n' = n+1
in p' `seq` cs' `seq` n' `seq` Just (b, (p', c, cs',n'))
-- -----------------------------------------------------------------------------
-- Token positions
-- `Posn' records the location of a token in the input text. It has three
-- fields: the address (number of characters preceding the token), line number
-- and column of a token within the file. `start_pos' gives the position of the
-- start of the file and `eof_pos' a standard encoding for the end of file.
-- `move_pos' calculates the new position after traversing a given character,
-- assuming the usual eight character tab stops.
data AlexPosn = AlexPn !Int !Int !Int
deriving (Eq,Show)
alexStartPos :: AlexPosn
alexStartPos = AlexPn 0 1 1
alexMove :: AlexPosn -> Char -> AlexPosn
alexMove (AlexPn a l c) '\t' = AlexPn (a+1) l (c+alex_tab_size-((c-1) `mod` alex_tab_size))
alexMove (AlexPn a l _) '\n' = AlexPn (a+1) (l+1) 1
alexMove (AlexPn a l c) _ = AlexPn (a+1) l (c+1)
-- -----------------------------------------------------------------------------
-- Monad (default and with ByteString input)
data AlexState = AlexState {
alex_pos :: !AlexPosn, -- position at current input location
alex_bpos:: !Int64, -- bytes consumed so far
alex_inp :: ByteString.ByteString, -- the current input
alex_chr :: !Char, -- the character before the input
alex_scd :: !Int -- the current startcode
}
-- Compile with -funbox-strict-fields for best results!
runAlex :: ByteString.ByteString -> Alex a -> Either String a
runAlex input__ (Alex f)
= case f (AlexState {alex_bpos = 0,
alex_pos = alexStartPos,
alex_inp = input__,
alex_chr = '\n',
alex_scd = 0}) of Left msg -> Left msg
Right ( _, a ) -> Right a
newtype Alex a = Alex { unAlex :: AlexState -> Either String (AlexState, a) }
instance Functor Alex where
fmap f a = Alex $ \s -> case unAlex a s of
Left msg -> Left msg
Right (s', a') -> Right (s', f a')
instance Applicative Alex where
pure a = Alex $ \s -> Right (s, a)
fa <*> a = Alex $ \s -> case unAlex fa s of
Left msg -> Left msg
Right (s', f) -> case unAlex a s' of
Left msg -> Left msg
Right (s'', b) -> Right (s'', f b)
instance Monad Alex where
m >>= k = Alex $ \s -> case unAlex m s of
Left msg -> Left msg
Right (s',a) -> unAlex (k a) s'
return = App.pure
alexGetInput :: Alex AlexInput
alexGetInput
= Alex $ \s@AlexState{alex_pos=pos,alex_bpos=bpos,alex_chr=c,alex_inp=inp__} ->
Right (s, (pos,c,inp__,bpos))
alexSetInput :: AlexInput -> Alex ()
alexSetInput (pos,c,inp__,bpos)
= Alex $ \s -> case s{alex_pos=pos,
alex_bpos=bpos,
alex_chr=c,
alex_inp=inp__} of
state__@(AlexState{}) -> Right (state__, ())
alexError :: String -> Alex a
alexError message = Alex $ const $ Left message
alexGetStartCode :: Alex Int
alexGetStartCode = Alex $ \s@AlexState{alex_scd=sc} -> Right (s, sc)
alexSetStartCode :: Int -> Alex ()
alexSetStartCode sc = Alex $ \s -> Right (s{alex_scd=sc}, ())
alexMonadScan = do
inp__@(_,_,_,n) <- alexGetInput
sc <- alexGetStartCode
case alexScan inp__ sc of
AlexEOF -> alexEOF
AlexError ((AlexPn _ line column),_,_,_) -> alexError $ "lexical error at line " ++ (show line) ++ ", column " ++ (show column)
AlexSkip inp__' _len -> do
alexSetInput inp__'
alexMonadScan
AlexToken inp__'@(_,_,_,n') _ action -> let len = n'-n in do
alexSetInput inp__'
action (ignorePendingBytes inp__) len
-- -----------------------------------------------------------------------------
-- Useful token actions
type AlexAction result = AlexInput -> Int64 -> Alex result
-- just ignore this token and scan another one
-- skip :: AlexAction result
skip _input _len = alexMonadScan
-- ignore this token, but set the start code to a new value
-- begin :: Int -> AlexAction result
begin code _input _len = do alexSetStartCode code; alexMonadScan
-- perform an action for this token, and set the start code to a new value
andBegin :: AlexAction result -> Int -> AlexAction result
(action `andBegin` code) input__ len = do
alexSetStartCode code
action input__ len
token :: (AlexInput -> Int64 -> token) -> AlexAction token
token t input__ len = return (t input__ len)
-- -----------------------------------------------------------------------------
-- Basic wrapper
-- -----------------------------------------------------------------------------
-- Basic wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- Posn wrapper
-- Adds text positions to the basic model.
-- -----------------------------------------------------------------------------
-- Posn wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- GScan wrapper
-- For compatibility with previous versions of Alex, and because we can.
-- File: alex-3.2.5/data/AlexWrapper-monadUserState
{-# LINE 1 "templates/wrappers.hs" #-}
-- -----------------------------------------------------------------------------
-- Alex wrapper code.
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
import Control.Applicative as App (Applicative (..))
import Data.Word (Word8)
import Data.Char (ord)
import qualified Data.Bits
-- | Encode a Haskell String to a list of Word8 values, in UTF8 format.
utf8Encode :: Char -> [Word8]
utf8Encode = uncurry (:) . utf8Encode'
utf8Encode' :: Char -> (Word8, [Word8])
utf8Encode' c = case go (ord c) of
(x, xs) -> (fromIntegral x, map fromIntegral xs)
where
go oc
| oc <= 0x7f = ( oc
, [
])
| oc <= 0x7ff = ( 0xc0 + (oc `Data.Bits.shiftR` 6)
, [0x80 + oc Data.Bits..&. 0x3f
])
| oc <= 0xffff = ( 0xe0 + (oc `Data.Bits.shiftR` 12)
, [0x80 + ((oc `Data.Bits.shiftR` 6) Data.Bits..&. 0x3f)
, 0x80 + oc Data.Bits..&. 0x3f
])
| otherwise = ( 0xf0 + (oc `Data.Bits.shiftR` 18)
, [0x80 + ((oc `Data.Bits.shiftR` 12) Data.Bits..&. 0x3f)
, 0x80 + ((oc `Data.Bits.shiftR` 6) Data.Bits..&. 0x3f)
, 0x80 + oc Data.Bits..&. 0x3f
])
type Byte = Word8
-- -----------------------------------------------------------------------------
-- The input type
type AlexInput = (AlexPosn, -- current position,
Char, -- previous char
[Byte], -- pending bytes on current char
String) -- current input string
ignorePendingBytes :: AlexInput -> AlexInput
ignorePendingBytes (p,c,_ps,s) = (p,c,[],s)
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar (_p,c,_bs,_s) = c
alexGetByte :: AlexInput -> Maybe (Byte,AlexInput)
alexGetByte (p,c,(b:bs),s) = Just (b,(p,c,bs,s))
alexGetByte (_,_,[],[]) = Nothing
alexGetByte (p,_,[],(c:s)) = let p' = alexMove p c
in case utf8Encode' c of
(b, bs) -> p' `seq` Just (b, (p', c, bs, s))
-- -----------------------------------------------------------------------------
-- Token positions
-- `Posn' records the location of a token in the input text. It has three
-- fields: the address (number of characters preceding the token), line number
-- and column of a token within the file. `start_pos' gives the position of the
-- start of the file and `eof_pos' a standard encoding for the end of file.
-- `move_pos' calculates the new position after traversing a given character,
-- assuming the usual eight character tab stops.
data AlexPosn = AlexPn !Int !Int !Int
deriving (Eq,Show)
alexStartPos :: AlexPosn
alexStartPos = AlexPn 0 1 1
alexMove :: AlexPosn -> Char -> AlexPosn
alexMove (AlexPn a l c) '\t' = AlexPn (a+1) l (c+alex_tab_size-((c-1) `mod` alex_tab_size))
alexMove (AlexPn a l _) '\n' = AlexPn (a+1) (l+1) 1
alexMove (AlexPn a l c) _ = AlexPn (a+1) l (c+1)
-- -----------------------------------------------------------------------------
-- Monad (default and with ByteString input)
data AlexState = AlexState {
alex_pos :: !AlexPosn, -- position at current input location
alex_inp :: String, -- the current input
alex_chr :: !Char, -- the character before the input
alex_bytes :: [Byte],
alex_scd :: !Int -- the current startcode
, alex_ust :: AlexUserState -- AlexUserState will be defined in the user program
}
-- Compile with -funbox-strict-fields for best results!
runAlex :: String -> Alex a -> Either String a
runAlex input__ (Alex f)
= case f (AlexState {alex_bytes = [],
alex_pos = alexStartPos,
alex_inp = input__,
alex_chr = '\n',
alex_ust = alexInitUserState,
alex_scd = 0}) of Left msg -> Left msg
Right ( _, a ) -> Right a
newtype Alex a = Alex { unAlex :: AlexState -> Either String (AlexState, a) }
instance Functor Alex where
fmap f a = Alex $ \s -> case unAlex a s of
Left msg -> Left msg
Right (s', a') -> Right (s', f a')
instance Applicative Alex where
pure a = Alex $ \s -> Right (s, a)
fa <*> a = Alex $ \s -> case unAlex fa s of
Left msg -> Left msg
Right (s', f) -> case unAlex a s' of
Left msg -> Left msg
Right (s'', b) -> Right (s'', f b)
instance Monad Alex where
m >>= k = Alex $ \s -> case unAlex m s of
Left msg -> Left msg
Right (s',a) -> unAlex (k a) s'
return = App.pure
alexGetInput :: Alex AlexInput
alexGetInput
= Alex $ \s@AlexState{alex_pos=pos,alex_chr=c,alex_bytes=bs,alex_inp=inp__} ->
Right (s, (pos,c,bs,inp__))
alexSetInput :: AlexInput -> Alex ()
alexSetInput (pos,c,bs,inp__)
= Alex $ \s -> case s{alex_pos=pos,alex_chr=c,alex_bytes=bs,alex_inp=inp__} of
state__@(AlexState{}) -> Right (state__, ())
alexError :: String -> Alex a
alexError message = Alex $ const $ Left message
alexGetStartCode :: Alex Int
alexGetStartCode = Alex $ \s@AlexState{alex_scd=sc} -> Right (s, sc)
alexSetStartCode :: Int -> Alex ()
alexSetStartCode sc = Alex $ \s -> Right (s{alex_scd=sc}, ())
alexGetUserState :: Alex AlexUserState
alexGetUserState = Alex $ \s@AlexState{alex_ust=ust} -> Right (s,ust)
alexSetUserState :: AlexUserState -> Alex ()
alexSetUserState ss = Alex $ \s -> Right (s{alex_ust=ss}, ())
alexMonadScan = do
inp__ <- alexGetInput
sc <- alexGetStartCode
case alexScan inp__ sc of
AlexEOF -> alexEOF
AlexError ((AlexPn _ line column),_,_,_) -> alexError $ "lexical error at line " ++ (show line) ++ ", column " ++ (show column)
AlexSkip inp__' _len -> do
alexSetInput inp__'
alexMonadScan
AlexToken inp__' len action -> do
alexSetInput inp__'
action (ignorePendingBytes inp__) len
-- -----------------------------------------------------------------------------
-- Useful token actions
type AlexAction result = AlexInput -> Int -> Alex result
-- just ignore this token and scan another one
-- skip :: AlexAction result
skip _input _len = alexMonadScan
-- ignore this token, but set the start code to a new value
-- begin :: Int -> AlexAction result
begin code _input _len = do alexSetStartCode code; alexMonadScan
-- perform an action for this token, and set the start code to a new value
andBegin :: AlexAction result -> Int -> AlexAction result
(action `andBegin` code) input__ len = do
alexSetStartCode code
action input__ len
token :: (AlexInput -> Int -> token) -> AlexAction token
token t input__ len = return (t input__ len)
-- -----------------------------------------------------------------------------
-- Basic wrapper
-- -----------------------------------------------------------------------------
-- Basic wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- Posn wrapper
-- Adds text positions to the basic model.
-- -----------------------------------------------------------------------------
-- Posn wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- GScan wrapper
-- For compatibility with previous versions of Alex, and because we can.
alex-3.2.5/data/AlexWrapper-monadUserState-bytestring
{-# LINE 1 "templates/wrappers.hs" #-}
-- -----------------------------------------------------------------------------
-- Alex wrapper code.
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
import Control.Applicative as App (Applicative (..))
import Data.Word (Word8)
import Data.Int (Int64)
import qualified Data.Char
import qualified Data.ByteString.Lazy as ByteString
import qualified Data.ByteString.Internal as ByteString (w2c)
type Byte = Word8
-- -----------------------------------------------------------------------------
-- The input type
type AlexInput = (AlexPosn, -- current position,
Char, -- previous char
ByteString.ByteString, -- current input string
Int64) -- bytes consumed so far
ignorePendingBytes :: AlexInput -> AlexInput
ignorePendingBytes i = i -- no pending bytes when lexing bytestrings
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar (_,c,_,_) = c
alexGetByte :: AlexInput -> Maybe (Byte,AlexInput)
alexGetByte (p,_,cs,n) =
case ByteString.uncons cs of
Nothing -> Nothing
Just (b, cs') ->
let c = ByteString.w2c b
p' = alexMove p c
n' = n+1
in p' `seq` cs' `seq` n' `seq` Just (b, (p', c, cs',n'))
-- -----------------------------------------------------------------------------
-- Token positions
-- `Posn' records the location of a token in the input text. It has three
-- fields: the address (number of characters preceding the token), line number
-- and column of a token within the file. `start_pos' gives the position of the
-- start of the file and `eof_pos' a standard encoding for the end of file.
-- `move_pos' calculates the new position after traversing a given character,
-- assuming the usual eight character tab stops.
data AlexPosn = AlexPn !Int !Int !Int
deriving (Eq,Show)
alexStartPos :: AlexPosn
alexStartPos = AlexPn 0 1 1
alexMove :: AlexPosn -> Char -> AlexPosn
alexMove (AlexPn a l c) '\t' = AlexPn (a+1) l (c+alex_tab_size-((c-1) `mod` alex_tab_size))
alexMove (AlexPn a l _) '\n' = AlexPn (a+1) (l+1) 1
alexMove (AlexPn a l c) _ = AlexPn (a+1) l (c+1)
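The tab case of `alexMove` advances the column to the next multiple-of-`alex_tab_size` stop (plus one, since columns are 1-based). A standalone sketch, with the tab size fixed at 8 (the default value of `alex_tab_size`), demonstrates the arithmetic:

```haskell
-- Standalone copies of AlexPosn and alexMove from the wrapper above,
-- with the tab size hard-coded to 8 for illustration.
data AlexPosn = AlexPn !Int !Int !Int
  deriving (Eq, Show)

alexTabSize :: Int
alexTabSize = 8

alexMove :: AlexPosn -> Char -> AlexPosn
alexMove (AlexPn a l c) '\t' =
  AlexPn (a+1) l (c + alexTabSize - ((c-1) `mod` alexTabSize))
alexMove (AlexPn a l _) '\n' = AlexPn (a+1) (l+1) 1
alexMove (AlexPn a l c) _    = AlexPn (a+1) l (c+1)

main :: IO ()
main = do
  print (alexMove (AlexPn 0 1 1) '\t')  -- tab at column 1 jumps to column 9
  print (alexMove (AlexPn 5 1 6) '\n')  -- newline resets column, bumps line
```

Note that the address field counts characters, not screen columns, so a tab still advances it by exactly one.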
-- -----------------------------------------------------------------------------
-- Monad (default and with ByteString input)
data AlexState = AlexState {
alex_pos :: !AlexPosn, -- position at current input location
alex_bpos:: !Int64, -- bytes consumed so far
alex_inp :: ByteString.ByteString, -- the current input
alex_chr :: !Char, -- the character before the input
alex_scd :: !Int -- the current startcode
, alex_ust :: AlexUserState -- AlexUserState will be defined in the user program
}
-- Compile with -funbox-strict-fields for best results!
runAlex :: ByteString.ByteString -> Alex a -> Either String a
runAlex input__ (Alex f)
= case f (AlexState {alex_bpos = 0,
alex_pos = alexStartPos,
alex_inp = input__,
alex_chr = '\n',
alex_ust = alexInitUserState,
alex_scd = 0}) of Left msg -> Left msg
Right ( _, a ) -> Right a
newtype Alex a = Alex { unAlex :: AlexState -> Either String (AlexState, a) }
instance Functor Alex where
fmap f a = Alex $ \s -> case unAlex a s of
Left msg -> Left msg
Right (s', a') -> Right (s', f a')
instance Applicative Alex where
pure a = Alex $ \s -> Right (s, a)
fa <*> a = Alex $ \s -> case unAlex fa s of
Left msg -> Left msg
Right (s', f) -> case unAlex a s' of
Left msg -> Left msg
Right (s'', b) -> Right (s'', f b)
instance Monad Alex where
m >>= k = Alex $ \s -> case unAlex m s of
Left msg -> Left msg
Right (s',a) -> unAlex (k a) s'
return = App.pure
alexGetInput :: Alex AlexInput
alexGetInput
= Alex $ \s@AlexState{alex_pos=pos,alex_bpos=bpos,alex_chr=c,alex_inp=inp__} ->
Right (s, (pos,c,inp__,bpos))
alexSetInput :: AlexInput -> Alex ()
alexSetInput (pos,c,inp__,bpos)
= Alex $ \s -> case s{alex_pos=pos,
alex_bpos=bpos,
alex_chr=c,
alex_inp=inp__} of
state__@(AlexState{}) -> Right (state__, ())
alexError :: String -> Alex a
alexError message = Alex $ const $ Left message
alexGetStartCode :: Alex Int
alexGetStartCode = Alex $ \s@AlexState{alex_scd=sc} -> Right (s, sc)
alexSetStartCode :: Int -> Alex ()
alexSetStartCode sc = Alex $ \s -> Right (s{alex_scd=sc}, ())
alexMonadScan = do
inp__@(_,_,_,n) <- alexGetInput
sc <- alexGetStartCode
case alexScan inp__ sc of
AlexEOF -> alexEOF
AlexError ((AlexPn _ line column),_,_,_) -> alexError $ "lexical error at line " ++ (show line) ++ ", column " ++ (show column)
AlexSkip inp__' _len -> do
alexSetInput inp__'
alexMonadScan
AlexToken inp__'@(_,_,_,n') _ action -> let len = n'-n in do
alexSetInput inp__'
action (ignorePendingBytes inp__) len
-- -----------------------------------------------------------------------------
-- Useful token actions
type AlexAction result = AlexInput -> Int64 -> Alex result
-- just ignore this token and scan another one
-- skip :: AlexAction result
skip _input _len = alexMonadScan
-- ignore this token, but set the start code to a new value
-- begin :: Int -> AlexAction result
begin code _input _len = do alexSetStartCode code; alexMonadScan
-- perform an action for this token, and set the start code to a new value
andBegin :: AlexAction result -> Int -> AlexAction result
(action `andBegin` code) input__ len = do
alexSetStartCode code
action input__ len
token :: (AlexInput -> Int64 -> token) -> AlexAction token
token t input__ len = return (t input__ len)
-- -----------------------------------------------------------------------------
-- Basic wrapper
-- -----------------------------------------------------------------------------
-- Basic wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- Posn wrapper
-- Adds text positions to the basic model.
-- -----------------------------------------------------------------------------
-- Posn wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- GScan wrapper
-- For compatibility with previous versions of Alex, and because we can.
alex-3.2.5/data/AlexWrapper-posn
{-# LINE 1 "templates/wrappers.hs" #-}
-- -----------------------------------------------------------------------------
-- Alex wrapper code.
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
import Data.Word (Word8)
import Data.Char (ord)
import qualified Data.Bits
-- | Encode a Haskell Char as a list of Word8 values, in UTF8 format.
utf8Encode :: Char -> [Word8]
utf8Encode = uncurry (:) . utf8Encode'
utf8Encode' :: Char -> (Word8, [Word8])
utf8Encode' c = case go (ord c) of
(x, xs) -> (fromIntegral x, map fromIntegral xs)
where
go oc
| oc <= 0x7f = ( oc
, [
])
| oc <= 0x7ff = ( 0xc0 + (oc `Data.Bits.shiftR` 6)
, [0x80 + oc Data.Bits..&. 0x3f
])
| oc <= 0xffff = ( 0xe0 + (oc `Data.Bits.shiftR` 12)
, [0x80 + ((oc `Data.Bits.shiftR` 6) Data.Bits..&. 0x3f)
, 0x80 + oc Data.Bits..&. 0x3f
])
| otherwise = ( 0xf0 + (oc `Data.Bits.shiftR` 18)
, [0x80 + ((oc `Data.Bits.shiftR` 12) Data.Bits..&. 0x3f)
, 0x80 + ((oc `Data.Bits.shiftR` 6) Data.Bits..&. 0x3f)
, 0x80 + oc Data.Bits..&. 0x3f
])
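The encoder above produces one, two, three, or four bytes per character depending on the code point. A self-contained version (flattening the head/tail split of `utf8Encode'` back into a single list) makes the byte layout easy to check:

```haskell
import Data.Word (Word8)
import Data.Char (ord)
import qualified Data.Bits

-- Standalone copy of the wrapper's UTF-8 encoder, flattened to return
-- a plain byte list for illustration.
utf8Encode :: Char -> [Word8]
utf8Encode c = map fromIntegral (go (ord c))
  where
    go oc
      | oc <= 0x7f   = [oc]                                     -- 1 byte
      | oc <= 0x7ff  = [ 0xc0 + (oc `Data.Bits.shiftR` 6)       -- 2 bytes
                       , 0x80 + oc Data.Bits..&. 0x3f ]
      | oc <= 0xffff = [ 0xe0 + (oc `Data.Bits.shiftR` 12)      -- 3 bytes
                       , 0x80 + ((oc `Data.Bits.shiftR` 6) Data.Bits..&. 0x3f)
                       , 0x80 + oc Data.Bits..&. 0x3f ]
      | otherwise    = [ 0xf0 + (oc `Data.Bits.shiftR` 18)      -- 4 bytes
                       , 0x80 + ((oc `Data.Bits.shiftR` 12) Data.Bits..&. 0x3f)
                       , 0x80 + ((oc `Data.Bits.shiftR` 6) Data.Bits..&. 0x3f)
                       , 0x80 + oc Data.Bits..&. 0x3f ]

main :: IO ()
main = do
  print (utf8Encode 'a')      -- [97]            ASCII: one byte
  print (utf8Encode '\233')   -- [195,169]       U+00E9: two bytes
  print (utf8Encode '\8364')  -- [226,130,172]   U+20AC: three bytes
```

This is why `alexGetByte` in the String-based wrappers carries a list of pending bytes: a single input character may expand to several DFA input bytes.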
type Byte = Word8
-- -----------------------------------------------------------------------------
-- The input type
type AlexInput = (AlexPosn, -- current position,
Char, -- previous char
[Byte], -- pending bytes on current char
String) -- current input string
ignorePendingBytes :: AlexInput -> AlexInput
ignorePendingBytes (p,c,_ps,s) = (p,c,[],s)
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar (_p,c,_bs,_s) = c
alexGetByte :: AlexInput -> Maybe (Byte,AlexInput)
alexGetByte (p,c,(b:bs),s) = Just (b,(p,c,bs,s))
alexGetByte (_,_,[],[]) = Nothing
alexGetByte (p,_,[],(c:s)) = let p' = alexMove p c
in case utf8Encode' c of
(b, bs) -> p' `seq` Just (b, (p', c, bs, s))
-- -----------------------------------------------------------------------------
-- Token positions
-- `Posn' records the location of a token in the input text. It has three
-- fields: the address (number of characters preceding the token), line number
-- and column of a token within the file. `start_pos' gives the position of the
-- start of the file and `eof_pos' a standard encoding for the end of file.
-- `move_pos' calculates the new position after traversing a given character,
-- assuming the usual eight character tab stops.
data AlexPosn = AlexPn !Int !Int !Int
deriving (Eq,Show)
alexStartPos :: AlexPosn
alexStartPos = AlexPn 0 1 1
alexMove :: AlexPosn -> Char -> AlexPosn
alexMove (AlexPn a l c) '\t' = AlexPn (a+1) l (c+alex_tab_size-((c-1) `mod` alex_tab_size))
alexMove (AlexPn a l _) '\n' = AlexPn (a+1) (l+1) 1
alexMove (AlexPn a l c) _ = AlexPn (a+1) l (c+1)
-- -----------------------------------------------------------------------------
-- Monad (default and with ByteString input)
-- -----------------------------------------------------------------------------
-- Basic wrapper
-- -----------------------------------------------------------------------------
-- Basic wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- Posn wrapper
-- Adds text positions to the basic model.
--alexScanTokens :: String -> [token]
alexScanTokens str0 = go (alexStartPos,'\n',[],str0)
where go inp__@(pos,_,_,str) =
case alexScan inp__ 0 of
AlexEOF -> []
AlexError ((AlexPn _ line column),_,_,_) -> error $ "lexical error at line " ++ (show line) ++ ", column " ++ (show column)
AlexSkip inp__' _ln -> go inp__'
AlexToken inp__' len act -> act pos (take len str) : go inp__'
-- -----------------------------------------------------------------------------
-- Posn wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- GScan wrapper
-- For compatibility with previous versions of Alex, and because we can.
alex-3.2.5/data/AlexWrapper-posn-bytestring
{-# LINE 1 "templates/wrappers.hs" #-}
-- -----------------------------------------------------------------------------
-- Alex wrapper code.
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
import Data.Word (Word8)
import Data.Int (Int64)
import qualified Data.Char
import qualified Data.ByteString.Lazy as ByteString
import qualified Data.ByteString.Internal as ByteString (w2c)
type Byte = Word8
-- -----------------------------------------------------------------------------
-- The input type
type AlexInput = (AlexPosn, -- current position,
Char, -- previous char
ByteString.ByteString, -- current input string
Int64) -- bytes consumed so far
ignorePendingBytes :: AlexInput -> AlexInput
ignorePendingBytes i = i -- no pending bytes when lexing bytestrings
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar (_,c,_,_) = c
alexGetByte :: AlexInput -> Maybe (Byte,AlexInput)
alexGetByte (p,_,cs,n) =
case ByteString.uncons cs of
Nothing -> Nothing
Just (b, cs') ->
let c = ByteString.w2c b
p' = alexMove p c
n' = n+1
in p' `seq` cs' `seq` n' `seq` Just (b, (p', c, cs',n'))
-- -----------------------------------------------------------------------------
-- Token positions
-- `Posn' records the location of a token in the input text. It has three
-- fields: the address (number of characters preceding the token), line number
-- and column of a token within the file. `start_pos' gives the position of the
-- start of the file and `eof_pos' a standard encoding for the end of file.
-- `move_pos' calculates the new position after traversing a given character,
-- assuming the usual eight character tab stops.
data AlexPosn = AlexPn !Int !Int !Int
deriving (Eq,Show)
alexStartPos :: AlexPosn
alexStartPos = AlexPn 0 1 1
alexMove :: AlexPosn -> Char -> AlexPosn
alexMove (AlexPn a l c) '\t' = AlexPn (a+1) l (c+alex_tab_size-((c-1) `mod` alex_tab_size))
alexMove (AlexPn a l _) '\n' = AlexPn (a+1) (l+1) 1
alexMove (AlexPn a l c) _ = AlexPn (a+1) l (c+1)
-- -----------------------------------------------------------------------------
-- Monad (default and with ByteString input)
-- -----------------------------------------------------------------------------
-- Basic wrapper
-- -----------------------------------------------------------------------------
-- Basic wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- Posn wrapper
-- Adds text positions to the basic model.
-- -----------------------------------------------------------------------------
-- Posn wrapper, ByteString version
--alexScanTokens :: ByteString.ByteString -> [token]
alexScanTokens str0 = go (alexStartPos,'\n',str0,0)
where go inp__@(pos,_,str,n) =
case alexScan inp__ 0 of
AlexEOF -> []
AlexError ((AlexPn _ line column),_,_,_) -> error $ "lexical error at line " ++ (show line) ++ ", column " ++ (show column)
AlexSkip inp__' _len -> go inp__'
AlexToken inp__'@(_,_,_,n') _ act ->
act pos (ByteString.take (n'-n) str) : go inp__'
-- -----------------------------------------------------------------------------
-- GScan wrapper
-- For compatibility with previous versions of Alex, and because we can.
alex-3.2.5/data/AlexWrapper-strict-bytestring
{-# LINE 1 "templates/wrappers.hs" #-}
-- -----------------------------------------------------------------------------
-- Alex wrapper code.
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
import Data.Word (Word8)
import qualified Data.Char
import qualified Data.ByteString as ByteString
import qualified Data.ByteString.Internal as ByteString hiding (ByteString)
import qualified Data.ByteString.Unsafe as ByteString
type Byte = Word8
-- -----------------------------------------------------------------------------
-- The input type
data AlexInput = AlexInput { alexChar :: {-# UNPACK #-} !Char,
alexStr :: {-# UNPACK #-} !ByteString.ByteString,
alexBytePos :: {-# UNPACK #-} !Int}
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar = alexChar
alexGetByte :: AlexInput -> Maybe (Byte,AlexInput)
alexGetByte (AlexInput {alexStr=cs,alexBytePos=n}) =
case ByteString.uncons cs of
Nothing -> Nothing
Just (c, rest) ->
Just (c, AlexInput {
alexChar = ByteString.w2c c,
alexStr = rest,
alexBytePos = n+1})
-- -----------------------------------------------------------------------------
-- Token positions
-- `Posn' records the location of a token in the input text. It has three
-- fields: the address (number of characters preceding the token), line number
-- and column of a token within the file. `start_pos' gives the position of the
-- start of the file and `eof_pos' a standard encoding for the end of file.
-- `move_pos' calculates the new position after traversing a given character,
-- assuming the usual eight character tab stops.
-- -----------------------------------------------------------------------------
-- Monad (default and with ByteString input)
-- -----------------------------------------------------------------------------
-- Basic wrapper
-- -----------------------------------------------------------------------------
-- Basic wrapper, ByteString version
-- alexScanTokens :: ByteString.ByteString -> [token]
alexScanTokens str = go (AlexInput '\n' str 0)
where go inp__ =
case alexScan inp__ 0 of
AlexEOF -> []
AlexError _ -> error "lexical error"
AlexSkip inp__' _len -> go inp__'
AlexToken inp__' _ act ->
let len = alexBytePos inp__' - alexBytePos inp__ in
act (ByteString.take len (alexStr inp__)) : go inp__'
-- -----------------------------------------------------------------------------
-- Posn wrapper
-- Adds text positions to the basic model.
-- -----------------------------------------------------------------------------
-- Posn wrapper, ByteString version
-- -----------------------------------------------------------------------------
-- GScan wrapper
-- For compatibility with previous versions of Alex, and because we can.
alex-3.2.5/doc/Makefile
include config.mk
XML_DOC = alex
INSTALL_XML_DOC = alex
include docbook-xml.mk
alex-3.2.5/doc/aclocal.m4
# FP_GEN_DOCBOOK_XML
# ------------------
# Generates a DocBook XML V4.2 document in conftest.xml.
AC_DEFUN([FP_GEN_DOCBOOK_XML],
[rm -f conftest.xml
cat > conftest.xml << EOF
<?xml version="1.0" encoding="iso-8859-1"?>
<!DOCTYPE book PUBLIC "-//OASIS//DTD DocBook XML V4.2//EN" "http://www.oasis-open.org/docbook/xml/4.2/docbookx.dtd">
<book id="test">
  <title>A DocBook Test Document</title>
  <chapter id="id-one">
    <title>A Chapter Title</title>
    <para>This is a paragraph, referencing <xref linkend="id-two"/>.</para>
  </chapter>
  <chapter id="id-two">
    <title>Another Chapter Title</title>
    <para>This is another paragraph, referencing <xref linkend="id-one"/>.</para>
  </chapter>
</book>
EOF
]) # FP_GEN_DOCBOOK_XML
# FP_PROG_XSLTPROC
# ----------------
# Sets the output variable XsltprocCmd to the full path of the XSLT processor
# xsltproc. XsltprocCmd is empty if xsltproc could not be found.
AC_DEFUN([FP_PROG_XSLTPROC],
[AC_PATH_PROG([XsltprocCmd], [xsltproc])
if test -z "$XsltprocCmd"; then
AC_MSG_WARN([cannot find xsltproc in your PATH, you will not be able to build the documentation])
fi
])# FP_PROG_XSLTPROC
# FP_DIR_DOCBOOK_XSL(XSL-DIRS)
# ----------------------------
# Check which of the directories XSL-DIRS contains DocBook XSL stylesheets. The
# output variable DIR_DOCBOOK_XSL will contain the first usable directory or
# will be empty if none could be found.
AC_DEFUN([FP_DIR_DOCBOOK_XSL],
[AC_REQUIRE([FP_PROG_XSLTPROC])dnl
if test -n "$XsltprocCmd"; then
AC_CACHE_CHECK([for DocBook XSL stylesheet directory], fp_cv_dir_docbook_xsl,
[FP_GEN_DOCBOOK_XML
fp_cv_dir_docbook_xsl=no
for fp_var in $1; do
if $XsltprocCmd ${fp_var}/html/docbook.xsl conftest.xml > /dev/null 2>&1; then
fp_cv_dir_docbook_xsl=$fp_var
break
fi
done
rm -rf conftest*])
fi
if test x"$fp_cv_dir_docbook_xsl" = xno; then
AC_MSG_WARN([cannot find DocBook XSL stylesheets, you will not be able to build the documentation])
DIR_DOCBOOK_XSL=
else
DIR_DOCBOOK_XSL=$fp_cv_dir_docbook_xsl
fi
AC_SUBST([DIR_DOCBOOK_XSL])
])# FP_DIR_DOCBOOK_XSL
# FP_PROG_XMLLINT
# ----------------
# Sets the output variable XmllintCmd to the full path of the XSLT processor
# xmllint. XmllintCmd is empty if xmllint could not be found.
AC_DEFUN([FP_PROG_XMLLINT],
[AC_PATH_PROG([XmllintCmd], [xmllint])
if test -z "$XmllintCmd"; then
AC_MSG_WARN([cannot find xmllint in your PATH, you will not be able to validate your documentation])
fi
])# FP_PROG_XMLLINT
# FP_CHECK_DOCBOOK_DTD
# --------------------
AC_DEFUN([FP_CHECK_DOCBOOK_DTD],
[AC_REQUIRE([FP_PROG_XMLLINT])dnl
if test -n "$XmllintCmd"; then
AC_MSG_CHECKING([for DocBook DTD])
FP_GEN_DOCBOOK_XML
if $XmllintCmd --valid --noout conftest.xml > /dev/null 2>&1; then
AC_MSG_RESULT([ok])
else
AC_MSG_RESULT([failed])
AC_MSG_WARN([cannot find a DTD for DocBook XML V4.2, you will not be able to validate your documentation])
AC_MSG_WARN([check your XML_CATALOG_FILES environment variable and/or /etc/xml/catalog])
fi
rm -rf conftest*
fi
])# FP_CHECK_DOCBOOK_DTD
# FP_GEN_FO
# ------------------
# Generates a formatting objects document in conftest.fo.
AC_DEFUN([FP_GEN_FO],
[rm -f conftest.fo
cat > conftest.fo << EOF
Test!
EOF
]) # FP_GEN_FO
# FP_PROG_FOP
# -----------
# Set the output variable 'FopCmd' to the first working 'fop' in the current
# 'PATH'. Note that /usr/bin/fop is broken in SuSE 9.1 (unpatched), so try
# /usr/share/fop/fop.sh in that case (or no 'fop'), too.
AC_DEFUN([FP_PROG_FOP],
[AC_PATH_PROGS([FopCmd1], [fop])
if test -n "$FopCmd1"; then
AC_CACHE_CHECK([for $FopCmd1 usability], [fp_cv_fop_usability],
[FP_GEN_FO
if "$FopCmd1" -fo conftest.fo -ps conftest.ps > /dev/null 2>&1; then
fp_cv_fop_usability=yes
else
fp_cv_fop_usability=no
fi
rm -rf conftest*])
if test x"$fp_cv_fop_usability" = xyes; then
FopCmd=$FopCmd1
fi
fi
if test -z "$FopCmd"; then
AC_PATH_PROGS([FopCmd2], [fop.sh], , [/usr/share/fop])
FopCmd=$FopCmd2
fi
AC_SUBST([FopCmd])
])# FP_PROG_FOP
# FP_PROG_FO_PROCESSOR
# --------------------
# Try to find an FO processor. PassiveTeX output is sometimes a bit strange, so
# try FOP first. Sets the output variables FopCmd, XmltexCmd, DvipsCmd, and
# PdfxmltexCmd.
AC_DEFUN([FP_PROG_FO_PROCESSOR],
[AC_REQUIRE([FP_PROG_FOP])
AC_PATH_PROG([XmltexCmd], [xmltex])
AC_PATH_PROG([DvipsCmd], [dvips])
if test -z "$FopCmd"; then
if test -z "$XmltexCmd"; then
AC_MSG_WARN([cannot find an FO => DVI converter, you will not be able to build DVI or PostScript documentation])
else
if test -z "$DvipsCmd"; then
AC_MSG_WARN([cannot find a DVI => PS converter, you will not be able to build PostScript documentation])
fi
fi
AC_PATH_PROG([PdfxmltexCmd], [pdfxmltex])
if test -z "$PdfxmltexCmd"; then
AC_MSG_WARN([cannot find an FO => PDF converter, you will not be able to build PDF documentation])
fi
elif test -z "$XmltexCmd"; then
AC_MSG_WARN([cannot find an FO => DVI converter, you will not be able to build DVI documentation])
fi
])# FP_PROG_FO_PROCESSOR
alex-3.2.5/doc/alex.1.in
.TH ALEX 1 "2003-09-09" "Glasgow FP Suite" "Alex Lexical Analyser Generator"
.SH NAME
alex \- the lexical analyser generator for Haskell
.SH SYNOPSIS
.B alex
[\fIOPTION\fR]... \fIfile\fR [\fIOPTION\fR]...
.SH DESCRIPTION
This manual page documents briefly the
.BR alex
command.
.PP
This manual page was written for the Debian GNU/Linux distribution
because the original program does not have a manual page. Instead, it
has documentation in various other formats, including DVI, Info and
HTML; see below.
.PP
.B Alex
is a lexical analyser generator system for Haskell. It is similar to the
tool lex or flex for C/C++.
.PP
Input files are expected to be of the form
.I file.x
and
.B alex
will produce output in
.I file.hs
.PP
Caveat: When using
.I hbc
(Chalmers Haskell) the command argument structure is slightly
different. This is because the hbc run time system takes some flags
as its own (for setting things like the heap size, etc). This problem
can be circumvented by adding a single dash (`-') to your command
line. So when using a hbc generated version of Alex, the argument
structure is:
.B alex \-
[\fIOPTION\fR]... \fIfile\fR [\fIOPTION\fR]...
.SH OPTIONS
The programs follow the usual GNU command line syntax, with long
options starting with two dashes (`--'). A summary of options is
included below. For a complete description, see the other
documentation.
.TP
.BR \-d ", " \-\-debug
Instructs Alex to generate a lexer which will output debugging messages
as it runs.
.TP
.BR \-g ", " \-\-ghc
Instructs Alex to generate a lexer which is optimised for compiling with
GHC. The lexer will be significantly more efficient, both in terms of
the size of the compiled lexer and its runtime.
.TP
\fB\-o\fR \fIFILE\fR, \fB\-\-outfile=\fIFILE
Specifies the filename in which the output is to be placed. By default,
this is the name of the input file with the
.I .x
suffix replaced by
.I .hs
.TP
\fB\-i\fR [\fIFILE\fR], \fB\-\-info\fR[=\fIFILE\fR]
Produces a human-readable rendition of the state machine (DFA) that
Alex derives from the lexer, in
.I FILE
(default:
.I file.info
where the input file is
.I file.x
).
The format of the info file is currently a bit basic, and not
particularly informative.
.TP
.BR \-v ", " \-\-version
Print version information on standard output then exit successfully.
.SH FILES
.I @DATADIR@
.SH "SEE ALSO"
.BR @DOCDIR@ ,
the Alex homepage
.UR http://haskell.org/alex/
(http://haskell.org/alex/)
.UE
.SH COPYRIGHT
Alex Version @VERSION@
Copyright (c) 1995-2003, Chris Dornan and Simon Marlow
.SH AUTHOR
This manual page was written by Ian Lynagh
, based on the happy manpage, for the Debian GNU/Linux
system (but may be used by others).
.\" Local variables:
.\" mode: nroff
.\" End:
alex-3.2.5/doc/alex.xml
2003-8-11
Alex User Guide
Chris Dornan, Isaac Jones, Simon Marlow
ijones@syntaxpolice.org

Alex is a tool for generating lexical analysers in
Haskell, given a description of the tokens to be recognised in
the form of regular expressions. It is similar to the tool
lex or flex for C/C++.

About Alex

Alex can always be obtained from its home page. The latest
source code lives in the git
repository on GitHub.
Release Notes for version 3.0

Unicode support (contributed mostly by Jean-Philippe
Bernardy, with help from Alan Zimmerman).
An Alex lexer now takes a UTF-8 encoded byte sequence
as input (see . If you are
using the "basic" wrapper or one of the other wrappers
that takes a Haskell String as
input, the string is automatically encoded into UTF-8
by Alex. If your input is
a ByteString, you are responsible
for ensuring that the input is UTF-8 encoded. The old
8-bit behaviour is still available via
the option.
Alex source files are assumed to be in UTF-8, like
Haskell source files. The lexer specification can use
Unicode characters and ranges.
alexGetChar is renamed to
alexGetByte in the generated code.
There is a new option, , that
restores the old behaviour.
Alex now does DFA minimization, which helps to reduce the
size of the generated tables, especially for lexers that
use Unicode.
Release Notes for version 2.2

Cabal 1.2 is now required.
ByteString wrappers: use Alex to lex ByteStrings
directly.

Release Notes for version 2.1.0

Switch to a Cabal build system: you need a recent
version of Cabal (1.1.6 or later). If you have GHC 6.4.2,
then you need to upgrade Cabal before building Alex. GHC
6.6 is fine.
Slight change in the error semantics: the input
returned on error is before the erroneous character was
read, not after. This helps to give better error
messages.

Release Notes for version 2.0

Alex has changed a lot between
versions 1.x and 2.0. The following is supposed to be an
exhaustive list of the changes:

Syntax changes

- Code blocks are now surrounded by {...} rather than %{...%}.
- Character-set macros now begin with ‘$’ instead of ‘^’ and have
  multi-character names.
- Regular expression macros now begin with ‘@’ instead of ‘%’ and have
  multi-character names.
- Macro definitions are no longer surrounded by { ... }.
- Rules are now of the form <c1,c2,...> regex { code }, where c1, c2
  are startcodes and code is an arbitrary Haskell expression.
- Regular expression syntax changes:
  - () is the empty regular expression (used to be ‘$’).
  - Set complement can now be expressed as [^sets] (for similarity with
    lex regular expressions).
  - The 'abc' form is no longer available; use [abc] instead.
  - ‘^’ and ‘$’ have the usual meanings: ‘^’ matches just after a
    ‘\n’, and ‘$’ matches just before a ‘\n’.
  - ‘\’ is now the escape character, not ‘^’.
  - The form "..." means the same as the sequence of characters inside
    the quotes, the difference being that special characters do not
    need to be escaped inside "...".
- Rules can have arbitrary predicates attached to them. This subsumes
  the previous left-context and right-context facilities (although
  these are still allowed as syntactic sugar).

Changes in the form of an Alex file

Each file can now only define a single grammar.
This change was made to simplify code generation.
Multiple grammars can be simulated using startcodes, or
split into separate modules.The programmer experience has been simplified, and
at the same time made more flexible. See the for details.You no longer need to import the
Alex module.Usage changesThe command-line syntax is quite different. See .Implementation changesA more efficient table representation, coupled with
standard table-compression techniques, are used to keep
the size of the generated code down.When compiling a grammar with GHC, the -g switch
causes an even faster and smaller grammar to be
generated.Startcodes are implemented in a different way: each
state corresponds to a different initial state in the DFA,
so the scanner doesn't have to check the startcode when it
gets to an accept state. This results in a larger, but
quicker, scanner.Reporting bugs in AlexPlease report bugs in Alex to
simonmar@microsoft.com. There are no specific
mailing lists for the discussion of Alex-related matters, but
such topics should be fine on the Haskell
Cafe mailing list.LicenseCopyright (c) 1995-2011, Chris Dornan and Simon Marlow.
All rights reserved.Redistribution and use in source and binary forms, with or
without modification, are permitted provided that the following
conditions are met:Redistributions of source code must retain the above
copyright notice, this list of conditions and the following
disclaimer.Redistributions in binary form must reproduce the
above copyright notice, this list of conditions and the
following disclaimer in the documentation and/or other
materials provided with the distribution.Neither the name of the copyright holders, nor the
names of the contributors may be used to endorse or promote
products derived from this software without specific prior
written permission.THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND
CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES,
INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR
CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF
USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED
AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING
IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF
THE POSSIBILITY OF SUCH DAMAGE.

Introduction

Alex is a tool for generating lexical analysers in Haskell,
given a description of the tokens to be recognised in the form of
regular expressions. It is similar to the tools
lex and flex for C/C++.

Alex takes a description of tokens based on regular
expressions and generates a Haskell module containing code for
scanning text efficiently. Alex is designed to be familiar to
existing lex users, although it does depart from lex in a number
of ways.

A simple Alex specification:

{
module Main (main) where
}
%wrapper "basic"
$digit = 0-9 -- digits
$alpha = [a-zA-Z] -- alphabetic characters
tokens :-
$white+ ;
"--".* ;
let { \s -> Let }
in { \s -> In }
$digit+ { \s -> Int (read s) }
[\=\+\-\*\/\(\)] { \s -> Sym (head s) }
$alpha [$alpha $digit \_ \']* { \s -> Var s }
{
-- Each action has type :: String -> Token
-- The token type:
data Token =
Let |
In |
Sym Char |
Var String |
Int Int
deriving (Eq,Show)
main = do
s <- getContents
print (alexScanTokens s)
}

A sample specification is given above. The first few lines between the
{ and } provide a code scrap
(some inlined Haskell code) to be placed directly in the output,
the scrap at the top of the module is normally used to declare the
module name for the generated Haskell module, in this case
Main.The next line, %wrapper "basic" controls
what kind of support code Alex should produce along with the basic
scanner. The basic wrapper selects a scanner
that tokenises a String and returns a list of
tokens. Wrappers are described fully in .The next two lines define the $digit and
$alpha macros for use in the token
definitions.The ‘tokens :-’ line ends the
macro definitions and starts the definition of the scanner.The scanner is specified as a series of token definitions
where each token specification takes the form ofregexp { code }The meaning of this rule is "if the input matches
regexp, then return
code". The code part along with the
braces can be replaced by simply
‘;’, meaning that this token should
be ignored in the input stream. As you can see, we've used this
to ignore whitespace in our example.

Our scanner is set up so that the actions are all functions
with type String->Token. When the token is
matched, the portion of the input stream that it matched is passed
to the appropriate action function as a
String.

At the bottom of the file we have another code fragment,
surrounded by braces { ... }. In this
fragment, we declare the type of the tokens, and give a
main function that we can use for testing it;
the main function just tokenises the input and
prints the results to standard output.

Alex has kindly provided the following function, which we can use to invoke the scanner:

alexScanTokens :: String -> [Token]

Alex arranges for the input stream to be tokenised, each of
the action functions to be passed the appropriate
String, and a list of Tokens
returned as the result. If the input stream is lazy, the output
stream will also be produced lazily (that is, unless you have any patterns that require a long lookahead).

We have demonstrated the simplest form of scanner here,
which was selected by the %wrapper "basic" line
near the top of the file. In general, actions do not have to have
type String->Token, and there's no requirement
for the scanner to return a list of tokens.

With this specification in the file Tokens.x, Alex can be used to generate Tokens.hs:

$ alex Tokens.x

If the module needed to be placed in a different file, Main.hs for example, then the output filename can be specified using the -o option:

$ alex Tokens.x -o Main.hs

The resulting module is Haskell 98 compatible. It can also be readily used with a Happy parser.

Alex Files

In this section we describe the layout of an Alex lexical specification.
We begin with the lexical syntax; elements of the lexical syntax
are referred to throughout the rest of this documentation, so
you may need to refer back to the following section several
times.
Lexical syntax

Alex's lexical syntax is given below. It is written as a set of macro definitions using Alex's own syntax. These macros are used in the BNF specification of the syntax later on.

$digit = [0-9]
$octdig = [0-7]
$hexdig = [0-9A-Fa-f]
$special = [\.\;\,\$\|\*\+\?\#\~\-\{\}\(\)\[\]\^\/]
$graphic = $printable # $white
@string = \" ($graphic # \")* \"
@id = [A-Za-z][A-Za-z'_]*
@smac = '$' id
@rmac = '@' id
@char = ($graphic # $special) | @escape
@escape = '\\' ($printable | 'x' $hexdig+ | 'o' $octdig+ | $digit+)
@code = -- curly braces surrounding a Haskell code fragment

Syntax of Alex files

In the following description of the Alex syntax, we use an
extended form of BNF, where optional phrases are enclosed in
square brackets ([ ... ]), and phrases which
may be repeated zero or more times are enclosed in braces
({ ... }). Literal text is enclosed in
single quotes.

An Alex lexical specification is normally placed in a file with a .x extension. Alex source files are encoded in UTF-8, just like Haskell source files (strictly speaking, GHC source files).

The overall layout of an Alex file is:

alex := [ @code ] [ wrapper ] [ encoding ] { macrodef } @id ':-' { rule } [ @code ]

The file begins and ends with optional code fragments.
These code fragments are copied verbatim into the generated
source file.At the top of the file, the code fragment is normally used
to declare the module name and some imports, and that is all it
should do: don't declare any functions or types in the top code
fragment, because Alex may need to inject some imports of its
own into the generated lexer code, and it does this by adding
them directly after this code fragment in the output
file.

Next comes an optional directives section. The first kind of directive is a wrapper specification:

wrapper := '%wrapper' @string

Wrappers are described in the section on wrappers. This can be followed by an optional encoding declaration:

encoding := '%encoding' @string

Encodings are described in the section on Unicode and UTF-8. Additionally, you can specify a token type, a typeclass, or an action type (depending on what wrapper you use):

action type := '%action' @string
token type := '%token' @string
typeclass(es) := '%typeclass' @string

These are described in the section on type signatures and typeclasses.

Macro definitions

Next, the lexer specification can contain a series of
macro definitions. There are two kinds of macros,
character set macros, which begin with
a $, and regular expression
macros, which begin with a @.
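For example, the two kinds can be combined, with a regular expression macro built from a character set macro (the macro names here are illustrative):

```
$digit   = [0-9]        -- a character set macro
@decimal = $digit+      -- a regular expression macro using it
```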
A character set macro can be used wherever a character set is
valid (see the syntax of character sets), and a regular expression macro can be used wherever a regular expression is valid (see the syntax of regular expressions).

macrodef := @smac '=' set
          | @rmac '=' regexp

Rules

The rules are heralded by the sequence
‘id :-’
in the file. It doesn't matter what you use for the
identifier, it is just there for documentation purposes. In
fact, it can be omitted, but the :- must be
left in.

The syntax of rules is as follows:

rule := [ startcodes ] token
      | startcodes '{' { token } '}'
token := [ left_ctx ] regexp [ right_ctx ] rhs
rhs := @code | ';'

Each rule defines one token in the lexical
specification. When the input stream matches the regular
expression in a rule, the Alex lexer will return the value of
the expression on the right hand side, which we call the
action. The action can be any Haskell
expression. Alex only places one restriction on actions: all
the actions must have the same type. They can be values in a
token type, for example, or possibly operations in a monad.
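For instance, every action in the example specification at the start of this document has the same type, String -> Token:

```
let     { \s -> Let }
$digit+ { \s -> Int (read s) }
```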
More about how this all works is described later.

The action may be missing, indicated by replacing it with ‘;’, in which case the token will be skipped in the input stream.

Alex will always find the longest match. For example, if we have a rule that matches whitespace:

$white+ ;

Then this rule will match as much whitespace at the
beginning of the input stream as it can. Be careful: if we
had instead written this rule as

$white* ;

then it would also match the empty string, which would mean that Alex could never fail to match a rule!

When the input stream matches more than one rule, the
rule which matches the longest prefix of the input stream
wins. If there are still several rules which match an equal
number of characters, then the rule which appears earliest in
the file wins.

Contexts

Alex allows a left and right context to be placed on
any rule:
left_ctx := '^'
| set '^'
right_ctx := '$'
| '/' regexp
| '/' @code
The left context matches the character which
immediately precedes the token in the input stream. The
character immediately preceding the beginning of the stream
is assumed to be ‘\n’. The
special left-context ‘^’ is
shorthand for ‘\n^’.

Right context is rather more general. There are three forms:

/ regexp

This right-context causes the rule to match if
and only if it is followed in the input stream by text
which matches
regexp.

NOTE: this should be used sparingly, because it
can have a serious impact on performance. Any time
this rule could match, its
right-context will be checked against the current
input stream.

$

Equivalent to ‘/\n’.

/ { ... }

This form is called a
predicate on the rule. The
Haskell expression inside the curly braces should have
type:
{ ... } :: user -- predicate state
-> AlexInput -- input stream before the token
-> Int -- length of the token
-> AlexInput -- input stream after the token
-> Bool -- True <=> accept the token
Alex will only accept the token as matching if
the predicate returns True.

See the basic interface section for the meaning of the
AlexInput type. The
user argument is available for
passing into the lexer a special state which is used
by predicates; to give this argument a value, the
alexScanUser entry point to the
lexer must be used (see the basic interface section).

Start codes

Start codes are a way of adding state to a lexical
specification, such that only certain rules will match for a
given state.

A startcode is simply an identifier, or the special
start code ‘0’. Each rule
may be given a list of startcodes under which it
applies:

startcode := @id | '0'
startcodes := '<' startcode { ',' startcode } '>'

When the lexer is invoked to scan the next token from
the input stream, the start code to use is also specified
(see the basic interface section). Only rules that mention this
start code are then enabled. Rules which do not have a list
of startcodes are available all the time.Each distinct start code mentioned in the lexical
specification causes a definition of the same name to be
inserted in the generated source file, whose value is of
type Int. For example, if we mentioned
startcodes foo and bar
in the lexical spec, then Alex will create definitions such
as:
foo = 1
bar = 2
in the output file.

Another way to think of start codes is as a way to
define several different (but possibly overlapping) lexical
specifications in a single file, since each start code
corresponds to a different set of rules. In concrete terms,
each start code corresponds to a distinct initial state in
the state machine that Alex derives from the lexical
specification.

Here is an example of using startcodes as states, for collecting the characters inside a string:

<0> ([^\"] | \n)* ;
<0> \" { begin string }
<string> [^\"] { stringchar }
<string> \" { begin 0 }

When it sees a quotation mark, the lexer switches into
the string state and each character
thereafter causes a stringchar action,
until the next quotation mark is found, when we switch back
into the 0 state again.

From the lexer's point of view, the startcode is just
an integer passed in, which tells it which state to start
in. In order to actually use it as a state, you must have
some way for the token actions to specify new start codes. The sections on wrappers describe some ways this can be done.
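For example, using the begin combinator provided by the monad wrapper (described later), a pair of rules can enter and leave a hypothetical comment start code:

```
<0>       "/*" { begin comment }   -- enter the comment state
<comment> "*/" { begin 0 }         -- return to the initial state
<comment> .    ;
<comment> \n   ;
```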
In some applications, it might be necessary to keep a
stack of start codes, where at the end
of a state we pop the stack and resume parsing in the
previous state. If you want this functionality, you have to
program it yourself.

Regular Expression

Regular expressions are the patterns that Alex uses to match
tokens in the input stream.

Syntax of regular expressions

regexp := rexp2 { '|' rexp2 }
rexp2 := rexp1 { rexp1 }
rexp1 := rexp0 [ '*' | '+' | '?' | repeat ]
rexp0 := set
| @rmac
| @string
| '(' [ regexp ] ')'
repeat := '{' $digit '}'
| '{' $digit ',' '}'
| '{' $digit ',' $digit '}'

The syntax of regular expressions is fairly standard, the
only difference from normal lex-style regular expressions being
that we allow the sequence () to denote the
regular expression that matches the empty string.

Spaces are ignored in a regular expression, so feel free
to space out your regular expression as much as you like, even
split it over multiple lines and include comments. Literal
whitespace can be included by surrounding it with quotes
" ", or by escaping each whitespace character
with \.

set

Matches any of the characters in set. See the syntax of character sets below.

@foo

Expands to the definition of the appropriate regular expression macro.

"..."

Matches the sequence of characters in the string, in that order.

r*

Matches zero or more occurrences of r.

r+

Matches one or more occurrences of r.

r?

Matches zero or one occurrence of r.

r{n}

Matches n occurrences of r.

r{n,}

Matches n or more occurrences of r.

r{n,m}

Matches between n and m (inclusive) occurrences of r.

Syntax of character sets

Character sets are the fundamental elements in a regular
expression. A character set is a pattern that matches a single
character. The syntax of character sets is as follows:

set := set '#' set0
| set0
set0 := @char [ '-' @char ]
| '.'
| @smac
| '[' [^] { set } ']'
| '~' set0

The various character set constructions are:

char

The simplest character set is a single Unicode character.
Note that special characters such as [
and . must be escaped by prefixing them
with \ (see the lexical syntax above for the list of special characters).

Certain non-printable characters have special escape
sequences. These are: \a,
\b, \f,
\n, \r,
\t, and \v. Other
characters can be represented by using their numerical
character values (although this may be non-portable):
\x0A is equivalent to
\n, for example.Whitespace characters are ignored; to represent a
literal space, escape it with \.

char-char

A range of characters can be expressed by separating
the characters with a ‘-’,
all the characters with codes in the given range are
included in the set. Character ranges can also be
non-portable.

.

The built-in set ‘.’
matches all characters except newline
(\n). Equivalent to the set [\x00-\x10ffff] # \n.

set0 # set1

Matches all the characters in
set0 that are not in
set1.

[sets]

The union of sets.

[^sets]

The complement of the union of the sets. Equivalent to ‘. # [sets]’.

~set

The complement of set. Equivalent to ‘. # set’.

A set macro is written as $ followed by
an identifier. There are some builtin character set
macros:

$white

Matches all whitespace characters, including newline. Equivalent to the set [\ \t\n\f\v\r].

$printable

Matches all "printable characters". Currently this
corresponds to Unicode code points 32 to 0x10ffff,
although strictly speaking there are many non-printable
code points in this region. In the future Alex may use a
more precise definition of $printable.

Character set macros can be defined at the top of the file at the same time as regular expression macros (see the macro definitions section). Here are some example character set macros:

$lls = a-z -- little letters
$not_lls = ~a-z -- anything but little letters
$ls_ds = [a-zA-Z0-9] -- letters and digits
$sym = [ \! \@ \# \$ ] -- the symbols !, @, #, and $
$sym_q_nl = [ \' \! \@ \# \$ \n ] -- the above symbols with ' and newline
$quotable = $printable # \' -- any graphic character except '
$del = \127 -- ASCII DEL

The Interface to an Alex-generated lexer

This section answers the question: "How do I include an Alex lexer in my program?"

Alex provides for a great deal of flexibility in how the
lexer is exposed to the rest of the program. For instance,
there's no need to parse a String directly if
you have some special character-buffer operations that avoid the
overheads of ordinary Haskell Strings. You
might want Alex to keep track of the line and column number in the
input text, or you might wish to do it yourself (perhaps you use a
different tab width from the standard 8-columns, for
example).

The general story is this: Alex provides a basic interface
to the generated lexer (described in the next section), which you
can use to parse tokens given an abstract input type with
operations over it. You also have the option of including a
wrapper, which provides a higher-level
abstraction over the basic interface; Alex comes with several
wrappers.

Unicode and UTF-8

Lexer specifications are written in terms of Unicode
characters, but Alex works internally on a UTF-8 encoded byte
sequence.
Depending on how you use Alex, the fact that Alex uses UTF-8
encoding internally may or may not affect you. If you use one
of the wrappers (below) that takes input from a
Haskell String, then the UTF-8 encoding is
handled automatically. However, if you take input from
a ByteString, then it is your
responsibility to ensure that the input is properly UTF-8
encoded.
None of this applies if you use the --latin1 option to Alex or specify a Latin-1 encoding via a
%encoding declaration. In that case, the input is
just a sequence of 8-bit bytes, interpreted as characters in the Latin-1
character set.
The following (case-insensitive) encoding strings are currently supported:

%encoding "latin-1"
%encoding "iso-8859-1"

Declare Latin-1 encoding as described above.

%encoding "utf-8"
%encoding "utf8"

Declare UTF-8 encoding. This is the default encoding, but it may be useful to declare it explicitly to protect against Alex being called with the --latin1 flag.

Basic interface

If you compile your Alex file without a
%wrapper declaration, then you get access to
the lowest-level API to the lexer. You must provide definitions
for the following, either in the same module or imported from
another module:

type AlexInput
alexGetByte :: AlexInput -> Maybe (Word8,AlexInput)
alexInputPrevChar :: AlexInput -> Char

The generated lexer is independent of the input type,
which is why you have to provide a definition for the input type
yourself. Note that the input type needs to keep track of the
previous character in the input stream;
this is used for implementing patterns with a left-context
(those that begin with ^ or
set^). If you
don't ever use patterns with a left-context in your lexical
specification, then you can safely forget about the previous
character in the input stream, and have
alexInputPrevChar return
undefined.

Alex will provide the following function:

alexScan :: AlexInput -- The current input
-> Int -- The "start code"
-> AlexReturn action -- The return value
data AlexReturn action
= AlexEOF
| AlexError
!AlexInput -- Remaining input
| AlexSkip
!AlexInput -- Remaining input
!Int -- Token length
| AlexToken
!AlexInput -- Remaining input
!Int -- Token length
action -- action value

Calling alexScan will scan a single
token from the input stream, and return a value of type
AlexReturn. The value returned is either:

AlexEOF

The end-of-file was reached.

AlexError

A valid token could not be recognised.

AlexSkip

The matched token did not have an action associated with it.

AlexToken

A token was matched, and the action associated with it is returned.

The action is simply the value of the
expression inside {...} on the
right-hand-side of the appropriate rule in the Alex file.
Alex doesn't specify what type these expressions should have; it simply requires that they all have the same type, or else you'll get a type error when you try to compile the generated lexer.

Once you have the action, it is up to
you what to do with it. The type of action
could be a function which takes the String
representation of the token and returns a value in some token
type, or it could be a continuation that takes the new input and
calls alexScan again, building a list of
tokens as it goes.

This is pretty low-level stuff; you have complete
flexibility about how you use the lexer, but there might be a
fair amount of support code to write before you can actually use
it. For this reason, we also provide a selection of wrappers
that add some common functionality to this basic scheme.
Wrappers are described in the next section.

There is another entry point, which is useful if your grammar contains any predicates (see the section on contexts):

alexScanUser
:: user -- predicate state
-> AlexInput -- The current input
-> Int -- The "start code"
-> AlexReturn action

The extra argument, of some type user, is passed to each predicate.

Wrappers

To use one of the provided wrappers, include the following declaration in your file:

%wrapper "name"

where name is the name of the wrapper, e.g. basic. The following sections describe each of the wrappers that come with Alex.

The "basic" wrapper

The basic wrapper is a good way to obtain a function of
type String -> [token] from a lexer
specification, with little fuss.

It provides definitions for
AlexInput, alexGetByte
and alexInputPrevChar that are suitable for
lexing a String input. It also provides a
function alexScanTokens which takes a
String input and returns a list of the
tokens it contains.

The basic wrapper provides no support
for using startcodes; the initial startcode is always set to
zero.

Here is the actual code included in the lexer when the
basic wrapper is selected:
type AlexInput = (Char, -- previous char
[Byte], -- rest of the bytes for the current char
String) -- rest of the input string
alexGetByte :: AlexInput -> Maybe (Byte,AlexInput)
alexGetByte (c,(b:bs),s) = Just (b,(c,bs,s))
alexGetByte (c,[],[]) = Nothing
alexGetByte (_,[],(c:s)) = case utf8Encode c of
(b:bs) -> Just (b, (c, bs, s))
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar (c,_,_) = c
-- alexScanTokens :: String -> [token]
alexScanTokens str = go ('\n',[],str)
where go inp@(_,_bs,str) =
case alexScan inp 0 of
AlexEOF -> []
AlexError _ -> error "lexical error"
AlexSkip inp' len -> go inp'
AlexToken inp' len act -> act (take len str) : go inp'
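The utf8Encode function used above is supplied by Alex's template code; as a sketch, it performs the standard UTF-8 encoding of a single code point, roughly as follows:

```haskell
import Data.Word (Word8)
import Data.Char (ord)
import Data.Bits (shiftR, (.&.))

-- Encode one Haskell Char (a Unicode code point) as its UTF-8 bytes.
utf8Encode :: Char -> [Word8]
utf8Encode = map fromIntegral . go . ord
  where
    go :: Int -> [Int]
    go c
      | c <= 0x7f   = [c]                                -- 1-byte sequence
      | c <= 0x7ff  = [ 0xc0 + (c `shiftR` 6)            -- 2-byte sequence
                      , 0x80 + (c .&. 0x3f) ]
      | c <= 0xffff = [ 0xe0 + (c `shiftR` 12)           -- 3-byte sequence
                      , 0x80 + ((c `shiftR` 6) .&. 0x3f)
                      , 0x80 + (c .&. 0x3f) ]
      | otherwise   = [ 0xf0 + (c `shiftR` 18)           -- 4-byte sequence
                      , 0x80 + ((c `shiftR` 12) .&. 0x3f)
                      , 0x80 + ((c `shiftR` 6) .&. 0x3f)
                      , 0x80 + (c .&. 0x3f) ]
```

This is why alexGetByte can hand the state machine one byte at a time while the wrapper buffers the remaining bytes of a multi-byte character.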
The type signature for alexScanTokens
is commented out, because the token type is
unknown. All of the actions in your lexical specification
should have type:

{ ... } :: String -> token

for some type token.

For an example of the use of the basic wrapper, see the file examples/Tokens.x in the Alex distribution.

The "posn" wrapper

The posn wrapper provides slightly more functionality
than the basic wrapper: it keeps track of line and column
numbers of tokens in the input text.

The posn wrapper provides the following, in addition to
the straightforward definitions of
alexGetByte and
alexInputPrevChar:
data AlexPosn = AlexPn !Int -- absolute character offset
!Int -- line number
!Int -- column number
type AlexInput = (AlexPosn, -- current position,
Char, -- previous char
[Byte], -- rest of the bytes for the current char
String) -- current input string
--alexScanTokens :: String -> [token]
alexScanTokens str = go (alexStartPos,'\n',[],str)
where go inp@(pos,_,_,str) =
case alexScan inp 0 of
AlexEOF -> []
AlexError ((AlexPn _ line column),_,_,_) -> error $ "lexical error at " ++ (show line) ++ " line, " ++ (show column) ++ " column"
AlexSkip inp' len -> go inp'
AlexToken inp' len act -> act pos (take len str) : go inp'
The types of the token actions should be:

{ ... } :: AlexPosn -> String -> token

For an example using the posn wrapper, see the file examples/Tokens_posn.x in the Alex distribution.

The "monad" wrapper

The monad wrapper is the most
flexible of the wrappers provided with Alex. It includes a
state monad which keeps track of the current input and text
position, and the startcode. It is intended to be a template
for building your own monads - feel free to copy the code and
modify it to build a monad with the facilities you
need.

data AlexState = AlexState {
alex_pos :: !AlexPosn, -- position at current input location
alex_inp :: String, -- the current input
alex_chr :: !Char, -- the character before the input
alex_bytes :: [Byte], -- rest of the bytes for the current char
alex_scd :: !Int -- the current startcode
}
newtype Alex a = Alex { unAlex :: AlexState
-> Either String (AlexState, a) }
instance Functor Alex where ...
instance Applicative Alex where ...
instance Monad Alex where ...
runAlex :: String -> Alex a -> Either String a
type AlexInput = (AlexPosn, -- current position,
Char, -- previous char
[Byte], -- rest of the bytes for the current char
String) -- current input string
alexGetInput :: Alex AlexInput
alexSetInput :: AlexInput -> Alex ()
alexError :: String -> Alex a
alexGetStartCode :: Alex Int
alexSetStartCode :: Int -> Alex ()

To invoke a scanner under the monad wrapper, use alexMonadScan:

alexMonadScan :: Alex result

The token actions should have the following type:

type AlexAction result = AlexInput -> Int -> Alex result
{ ... } :: AlexAction result

The Alex file must also define a function alexEOF, which will be executed when the end-of-file is scanned:

alexEOF :: Alex result

The monad wrapper also provides some useful combinators for constructing token actions:

-- skip :: AlexAction result
skip input len = alexMonadScan
-- andBegin :: AlexAction result -> Int -> AlexAction result
(act `andBegin` code) input len = do alexSetStartCode code; act input len
-- begin :: Int -> AlexAction result
begin code = skip `andBegin` code
-- token :: (AlexInput -> Int -> token) -> AlexAction token
token t input len = return (t input len)

The "monadUserState" wrapper

The monadUserState wrapper is built
upon the monad wrapper. It includes a reference
to a type which must be defined in the user's program,
AlexUserState, and a call to an initialization
function which must also be defined in the user's program,
alexInitUserState. It gives great flexibility
because it is now possible to add any needed information and carry
it during the whole lexing phase.

The generated code is the same as in the monad wrapper, except in three places:

1) The definition of the general state, which now refers to a
type AlexUserState that must be defined in the Alex file.
data AlexState = AlexState {
alex_pos :: !AlexPosn, -- position at current input location
alex_inp :: String, -- the current input
alex_chr :: !Char, -- the character before the input
alex_bytes :: [Byte], -- rest of the bytes for the current char
alex_scd :: !Int, -- the current startcode
alex_ust :: AlexUserState -- AlexUserState will be defined in the user program
}
2) The initialization code, where a user-specified routine (alexInitUserState) will be
called.
runAlex :: String -> Alex a -> Either String a
runAlex input (Alex f)
= case f (AlexState {alex_pos = alexStartPos,
alex_inp = input,
alex_chr = '\n',
alex_bytes = [],
alex_ust = alexInitUserState,
alex_scd = 0}) of Left msg -> Left msg
Right ( _, a ) -> Right a
3) Two helper functions (alexGetUserState
and alexSetUserState) are defined.
alexGetUserState :: Alex AlexUserState
alexSetUserState :: AlexUserState -> Alex ()
Here is an example of code in the user's Alex file defining
the type and function:

data AlexUserState = AlexUserState
{
lexerCommentDepth :: Int
, lexerStringValue :: String
}
alexInitUserState :: AlexUserState
alexInitUserState = AlexUserState
{
lexerCommentDepth = 0
, lexerStringValue = ""
}
getLexerCommentDepth :: Alex Int
getLexerCommentDepth = do ust <- alexGetUserState; return (lexerCommentDepth ust)
setLexerCommentDepth :: Int -> Alex ()
setLexerCommentDepth ss = do ust <- alexGetUserState; alexSetUserState ust{lexerCommentDepth=ss}
getLexerStringValue :: Alex String
getLexerStringValue = do ust <- alexGetUserState; return (lexerStringValue ust)
setLexerStringValue :: String -> Alex ()
setLexerStringValue ss = do ust <- alexGetUserState; alexSetUserState ust{lexerStringValue=ss}
addCharToLexerStringValue :: Char -> Alex ()
addCharToLexerStringValue c = do ust <- alexGetUserState; alexSetUserState ust{lexerStringValue=c:(lexerStringValue ust)}
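These helpers can then be used from token actions. The following is a hedged sketch (assuming the monadUserState wrapper, with a hypothetical stringSC start code and StringTok token constructor) that accumulates the contents of a string literal in the user state:

```
<stringSC> \"  { \_ _ -> do s <- getLexerStringValue
                            setLexerStringValue ""
                            alexSetStartCode 0
                            return (StringTok (reverse s)) }
<stringSC> .   { \(_, _, _, str) _ -> do
                   addCharToLexerStringValue (head str)
                   alexMonadScan }
```

The reverse is needed because addCharToLexerStringValue conses each character onto the front of the accumulated string.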
The "gscan" wrapper

The gscan wrapper is provided mainly
for historical reasons: it exposes an interface which is very
similar to that provided by Alex version 1.x. The interface
is intended to be very general, allowing actions to modify the
startcode, and pass around an arbitrary state value.

alexGScan :: StopAction state result -> state -> String -> result
type StopAction state result
= AlexPosn -> Char -> String -> (Int,state) -> result

The token actions should all have this type:

{ ... } :: AlexPosn -- token position
-> Char -- previous character
-> String -- input string at token
-> Int -- length of token
-> ((Int,state) -> result) -- continuation
-> (Int,state) -- current (startcode,state)
-> result

The bytestring wrappers

The basic-bytestring,
posn-bytestring and
monad-bytestring wrappers are variations on the
basic, posn and
monad wrappers that use lazy
ByteStrings as the input and token types instead of
an ordinary String.

The point of using these wrappers is that
ByteStrings provide a more memory efficient
representation of an input stream. They can also be somewhat faster to
process. Note that using these wrappers adds a dependency
on the ByteString modules, which live in the
bytestring package (or in the
base package in ghc-6.6).

As mentioned earlier (in the section on Unicode and UTF-8), Alex
lexers internally process a UTF-8 encoded string of bytes.
This means that the ByteString supplied
as input when using one of the ByteString wrappers should be
UTF-8 encoded (or use either the --latin1 option or the %encoding declaration).
Do note that the token is provided as a lazy ByteString, which is not the most compact representation for short strings. You may want to
convert to a strict ByteString or perhaps something
more compact still. Note also that by default tokens share space with
the input ByteString which has the advantage that it
does not need to make a copy but it also prevents the input from being
garbage collected. It may make sense in some applications to use
ByteString's copy function to
unshare tokens that will be kept for a long time, to allow the original
input to be collected.

The "basic-bytestring" wrapper

The basic-bytestring wrapper is the same as
the basic wrapper but with lazy
ByteString instead of String:
import qualified Data.ByteString.Lazy as ByteString
data AlexInput = AlexInput { alexChar :: {-# UNPACK #-} !Char, -- previous char
alexStr :: !ByteString.ByteString, -- current input string
alexBytePos :: {-# UNPACK #-} !Int64} -- bytes consumed so far
alexGetByte :: AlexInput -> Maybe (Word8,AlexInput)
alexInputPrevChar :: AlexInput -> Char
-- alexScanTokens :: ByteString.ByteString -> [token]
All of the actions in your lexical specification
should have type:

{ ... } :: ByteString.ByteString -> token

for some type token.

The "posn-bytestring" wrapper

The posn-bytestring wrapper is the same as
the posn wrapper but with lazy
ByteString instead of String:
import qualified Data.ByteString.Lazy as ByteString
type AlexInput = (AlexPosn, -- current position,
Char, -- previous char
ByteString.ByteString, -- current input string
Int64) -- bytes consumed so far
-- alexScanTokens :: ByteString.ByteString -> [token]
All of the actions in your lexical specification
should have type:

{ ... } :: AlexPosn -> ByteString.ByteString -> token

for some type token.

The "monad-bytestring" wrapper

The monad-bytestring wrapper is the same as
the monad wrapper but with lazy
ByteString instead of String:
import qualified Data.ByteString.Lazy as ByteString
data AlexState = AlexState {
alex_pos :: !AlexPosn, -- position at current input location
alex_bpos:: !Int64, -- bytes consumed so far
alex_inp :: ByteString.ByteString, -- the current input
alex_chr :: !Char, -- the character before the input
alex_scd :: !Int -- the current startcode
}
newtype Alex a = Alex { unAlex :: AlexState
-> Either String (AlexState, a) }
runAlex :: ByteString.ByteString -> Alex a -> Either String a
type AlexInput = (AlexPosn, -- current position,
Char, -- previous char
ByteString.ByteString, -- current input string
Int64) -- bytes consumed so far
-- token :: (AlexInput -> Int -> token) -> AlexAction token
All of the actions in your lexical specification
have the same type as in the monad wrapper. It is
only the types of the function to run the monad and the type of the
token function that change.

The "monadUserState-bytestring" wrapper

The monadUserState-bytestring wrapper is the same as
the monadUserState wrapper but with lazy
ByteString instead of String:
import qualified Data.ByteString.Lazy as ByteString
data AlexState = AlexState {
alex_pos :: !AlexPosn, -- position at current input location
alex_bpos:: !Int64, -- bytes consumed so far
alex_inp :: ByteString.ByteString, -- the current input
alex_chr :: !Char, -- the character before the input
alex_scd :: !Int -- the current startcode
, alex_ust :: AlexUserState -- AlexUserState will be defined in the user program
}
newtype Alex a = Alex { unAlex :: AlexState
-> Either String (AlexState, a) }
runAlex :: ByteString.ByteString -> Alex a -> Either String a
-- token :: (AlexInput -> Int -> token) -> AlexAction token
All of the actions in your lexical specification
have the same type as in the monadUserState wrapper. It is
only the types of the function to run the monad and the type of the
token function that change.

Type Signatures and Typeclasses

The %token, %typeclass,
and %action directives can be used to cause
Alex to emit additional type signatures in generated
code. This allows the use of typeclasses in generated lexers.

Generating Type Signatures with Wrappers

The %token directive can be used to
specify the token type when any kind of
%wrapper directive has been given.
Whenever %token is used, the
%typeclass directive can also be used to
specify one or more typeclass constraints. The following
shows a simple lexer that makes use of this to interpret the
meaning of tokens using the Read
typeclass:
%wrapper "basic"
%token "Token s"
%typeclass "Read s"
tokens :-
[a-zA-Z0-9]+ { mkToken }
[ \t\r\n]+ ;
{
data Token s = Tok s
mkToken :: Read s => String -> Token s
mkToken = Tok . read
lex :: Read s => String -> [Token s]
lex = alexScanTokens
}
Multiple typeclasses can be given by separating them with
commas, for example:
%typeclass "Read s, Eq s"
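To see why the Read constraint is useful, here is a self-contained analogue of the overloaded lexer above (no Alex run needed): it tokenises whitespace-separated words and interprets each one via Read, so the same lexer can yield different token types depending on the type it is used at.

```haskell
-- A Read-polymorphic token type, mirroring the Token s of the
-- specification above.
data Token s = Tok s deriving (Eq, Show)

-- Interpret each whitespace-separated word with the Read class.
lexWords :: Read s => String -> [Token s]
lexWords = map (Tok . read) . words
```

For example, lexWords "1 2 3" :: [Token Int] yields [Tok 1, Tok 2, Tok 3], while the same input can equally be lexed at [Token Double].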
Generating Type Signatures without Wrappers

Type signatures can also be generated for lexers that do
not use any wrapper. Instead of the %token
directive, the %action directive is used to
specify the type of a lexer action. The
%typeclass directive can be used to specify
the typeclass in the same way as with a wrapper. The
following example shows the use of typeclasses with a
"homegrown" monadic lexer:
{
{-# LANGUAGE FlexibleContexts #-}
module Lexer where
import Control.Monad.State
import qualified Data.Bits
import Data.Word
}
%action "AlexInput -> Int -> m (Token s)"
%typeclass "Read s, MonadState AlexState m"
tokens :-
[a-zA-Z0-9]+ { mkToken }
[ \t\n\r]+ ;
{
alexEOF :: MonadState AlexState m => m (Token s)
alexEOF = return EOF
mkToken :: (Read s, MonadState AlexState m) =>
AlexInput -> Int -> m (Token s)
mkToken (_, _, _, s) len = return (Tok (read (take len s)))
data Token s = Tok s | EOF
lex :: (MonadState AlexState m, Read s) => String -> m (Token s)
lex input = alexMonadScan
-- "Boilerplate" code from monad wrapper has been omitted
}
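The MonadState style above can be sketched in miniature without the Alex boilerplate. Here the state is simply the list of pending words (a stand-in for AlexState), and nextToken plays the role of a lexer action; the names are illustrative:

```haskell
import Control.Monad.State  -- from the mtl package

data Token s = Tok s | EOF deriving (Eq, Show)

-- one lexer step: consume a pending word, or report EOF
nextToken :: (Read s, MonadState [String] m) => m (Token s)
nextToken = do
  ws <- get
  case ws of
    []       -> return EOF
    (w:rest) -> put rest >> return (Tok (read w))

runLexer :: Read s => String -> [Token s]
runLexer inp = evalState go (words inp)
  where
    go = do
      t <- nextToken
      case t of
        EOF -> return []
        _   -> fmap (t :) go

main :: IO ()
main = print (runLexer "10 20" :: [Token Int])
```

Because nextToken is written against the MonadState class rather than a concrete monad, the same action runs in State, StateT over IO, or any other instance — which is the point of the %typeclass constraint in the specification above.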
The %token directive may only be used with a wrapper, and %action
may only be used when no wrapper is used. The %typeclass directive
cannot be given without the %token or %action directive.

Invoking Alex

The command line syntax for Alex is entirely standard:

$ alex { option } file.x { option }

Alex expects a single
file.x to be named
on the command line. By default, Alex will create
file.hs containing
the Haskell source for the lexer.

The options that Alex accepts are listed below:

-o file, --outfile=file
    Specifies the filename in which the output is to be placed. By
    default, this is the name of the input file with the .x suffix
    replaced by .hs.

-i file, --info=file
    Produces a human-readable rendition of the state machine (DFA)
    that Alex derives from the lexer, in file (default: file.info,
    where the input file is file.x). The format of the info file is
    currently a bit basic, and not particularly informative.

-t dir, --template=dir
    Look in dir for template files.

-g, --ghc
    Causes Alex to produce a lexer which is optimised for compiling
    with GHC. The lexer will be significantly more efficient, both
    in terms of the size of the compiled lexer and its runtime.

-d, --debug
    Causes Alex to produce a lexer which will output debugging
    messages as it runs.

-l, --latin1
    Disables the use of UTF-8 encoding in the generated
    lexer. This has two consequences:

      * The Alex source file is still assumed to be UTF-8 encoded,
        but any Unicode characters outside the range 0-255 are
        mapped to Latin-1 characters by taking the code point
        modulo 256.

      * The built-in macros $printable and '.' range over the
        Latin-1 character set, not the Unicode character set.
    Note that this currently does not disable the UTF-8 encoding
    that happens in the "basic" wrappers, so it does not make sense
    in conjunction with these wrappers (not that you would want to
    do that, anyway). Alternatively, a %encoding "latin1" declaration
    can be used inside the Alex source file to request a Latin-1
    mapping; see the description of the %encoding declaration for
    more information.
-?, --help
    Display help and exit.

-V, --version
    Output version information and exit. Note that for legacy
    reasons -v is supported, too, but its use is deprecated. -v will
    be used for verbose mode when it is actually implemented.
alex-3.2.5/doc/config.mk.in 0000755 0000000 0000000 00000000654 07346545000 013623 0 ustar 00 0000000 0000000 #-----------------------------------------------------------------------------
# DocBook XML stuff
XSLTPROC = @XsltprocCmd@
XMLLINT = @XmllintCmd@
FOP = @FopCmd@
XMLTEX = @XmltexCmd@
DBLATEX = @DbLatexCmd@
DIR_DOCBOOK_XSL = @DIR_DOCBOOK_XSL@
XSLTPROC_LABEL_OPTS = --stringparam toc.section.depth 3 \
--stringparam section.autolabel 1 \
--stringparam section.label.includes.component.label 1
alex-3.2.5/doc/configure.ac 0000755 0000000 0000000 00000001155 07346545000 013703 0 ustar 00 0000000 0000000
AC_INIT([Alex docs], [1.0], [simonmar@microsoft.com], [])
AC_CONFIG_SRCDIR([Makefile])
dnl ** check for DocBook toolchain
FP_CHECK_DOCBOOK_DTD
FP_DIR_DOCBOOK_XSL([/usr/share/xml/docbook/stylesheet/nwalsh/current /usr/share/xml/docbook/stylesheet/nwalsh /usr/share/sgml/docbook/docbook-xsl-stylesheets* /usr/share/sgml/docbook/xsl-stylesheets* /opt/kde?/share/apps/ksgmltools2/docbook/xsl /usr/share/docbook-xsl /usr/share/sgml/docbkxsl /usr/local/share/xsl/docbook /sw/share/xml/xsl/docbook-xsl /usr/share/xml/docbook/xsl-stylesheets*])
AC_PATH_PROG(DbLatexCmd,dblatex)
AC_CONFIG_FILES([config.mk alex.1])
AC_OUTPUT
alex-3.2.5/doc/docbook-xml.mk 0000755 0000000 0000000 00000006332 07346545000 014166 0 ustar 00 0000000 0000000 #-----------------------------------------------------------------------------
# DocBook XML
.PHONY: html html-no-chunks chm HxS fo dvi ps pdf
ifneq "$(XML_DOC)" ""
all :: html
# multi-file XML document: main document name is specified in $(XML_DOC),
# sub-documents (.xml files) listed in $(XML_SRCS).
ifeq "$(XML_SRCS)" ""
XML_SRCS = $(wildcard *.xml)
endif
XML_HTML = $(addsuffix /index.html,$(basename $(XML_DOC)))
XML_HTML_NO_CHUNKS = $(addsuffix .html,$(XML_DOC))
XML_CHM = $(addsuffix .chm,$(XML_DOC))
XML_HxS = $(addsuffix .HxS,$(XML_DOC))
XML_DVI = $(addsuffix .dvi,$(XML_DOC))
XML_PS = $(addsuffix .ps,$(XML_DOC))
XML_PDF = $(addsuffix .pdf,$(XML_DOC))
$(XML_HTML) $(XML_HTML_NO_CHUNKS) $(XML_DVI) $(XML_PS) $(XML_PDF) :: $(XML_SRCS)
html :: $(XML_HTML)
html-no-chunks :: $(XML_HTML_NO_CHUNKS)
chm :: $(XML_CHM)
HxS :: $(XML_HxS)
dvi :: $(XML_DVI)
ps :: $(XML_PS)
pdf :: $(XML_PDF)
CLEAN_FILES += $(XML_HTML_NO_CHUNKS) $(XML_DVI) $(XML_PS) $(XML_PDF)
FPTOOLS_CSS = fptools.css
clean ::
$(RM) -rf $(XML_DOC).out $(basename $(XML_DOC)) $(basename $(XML_DOC))-htmlhelp $(XML_DOC).pdf $(XML_DOC).dvi $(XML_DOC).ps
validate ::
$(XMLLINT) --valid --noout $(XMLLINT_OPTS) $(XML_DOC).xml
endif
#-----------------------------------------------------------------------------
# DocBook XML suffix rules
#
%.html : %.xml
$(XSLTPROC) --output $@ \
--stringparam html.stylesheet $(FPTOOLS_CSS) \
$(XSLTPROC_LABEL_OPTS) $(XSLTPROC_OPTS) \
$(DIR_DOCBOOK_XSL)/html/docbook.xsl $<
%/index.html : %.xml
$(RM) -rf $(dir $@)
$(XSLTPROC) --stringparam base.dir $(dir $@) \
--stringparam use.id.as.filename 1 \
--stringparam html.stylesheet $(FPTOOLS_CSS) \
$(XSLTPROC_LABEL_OPTS) $(XSLTPROC_OPTS) \
$(DIR_DOCBOOK_XSL)/html/chunk.xsl $<
cp $(FPTOOLS_CSS) $(dir $@)
# Note: Numeric labeling seems to be uncommon for HTML Help
%-htmlhelp/index.html : %.xml
$(RM) -rf $(dir $@)
$(XSLTPROC) --stringparam base.dir $(dir $@) \
--stringparam manifest.in.base.dir 1 \
--stringparam htmlhelp.chm "..\\"$(basename $<).chm \
$(XSLTPROC_OPTS) \
$(DIR_DOCBOOK_XSL)/htmlhelp/htmlhelp.xsl $<
%-htmlhelp2/collection.HxC : %.xml
$(RM) -rf $(dir $@)
$(XSLTPROC) --stringparam base.dir $(dir $@) \
--stringparam use.id.as.filename 1 \
--stringparam manifest.in.base.dir 1 \
$(XSLTPROC_OPTS) \
$(DIR_DOCBOOK_XSL)/htmlhelp2/htmlhelp2.xsl $<
# TODO: Detect hhc & Hxcomp via autoconf
#
# Two obstacles here:
#
# * The reason for the strange "if" below is that hhc returns 0 on error and 1
# on success, the opposite of what shells and make expect.
#
# * There seems to be some trouble with DocBook indices, but the *.chm looks OK,
# anyway, therefore we pacify make by "|| true". Ugly...
#
%.chm : %-htmlhelp/index.html
( cd $(dir $<) && if hhc htmlhelp.hhp ; then false ; else true ; fi ) || true
%.HxS : %-htmlhelp2/collection.HxC
( cd $(dir $<) && if Hxcomp -p collection.HxC -o ../$@ ; then false ; else true ; fi )
ifneq "$(DBLATEX)" ""
%.pdf : %.xml
$(DBLATEX) -tpdf $<
%.dvi : %.xml
$(DBLATEX) -tdvi $<
%.ps : %.xml
$(DBLATEX) -tps $<
endif
alex-3.2.5/doc/fptools.css 0000755 0000000 0000000 00000001424 07346545000 013614 0 ustar 00 0000000 0000000 div {
font-family: sans-serif;
color: black;
background: white
}
h1, h2, h3, h4, h5, h6, p.title { color: #005A9C }
h1 { font: 170% sans-serif }
h2 { font: 140% sans-serif }
h3 { font: 120% sans-serif }
h4 { font: bold 100% sans-serif }
h5 { font: italic 100% sans-serif }
h6 { font: small-caps 100% sans-serif }
pre {
font-family: monospace;
border-width: 1px;
border-style: solid;
padding: 0.3em
}
pre.screen { color: #006400 }
pre.programlisting { color: maroon }
div.example {
background-color: #fffcf5;
margin: 1ex 0em;
border: solid #412e25 1px;
padding: 0ex 0.4em
}
a:link { color: #0000C8 }
a:hover { background: #FFFFA8 }
a:active { color: #D00000 }
a:visited { color: #680098 }
alex-3.2.5/examples/ 0000755 0000000 0000000 00000000000 07346545000 012461 5 ustar 00 0000000 0000000 alex-3.2.5/examples/Makefile 0000755 0000000 0000000 00000002523 07346545000 014126 0 ustar 00 0000000 0000000 ALEX=../dist/build/alex/alex
HC=ghc -Wall -fno-warn-unused-binds -fno-warn-missing-signatures -fno-warn-unused-matches -fno-warn-name-shadowing -fno-warn-unused-imports -fno-warn-tabs
HAPPY=happy
HAPPY_OPTS=-agc
ifeq "$(TARGETPLATFORM)" "i386-unknown-mingw32"
exeext=.exe
else
exeext=.bin
endif
PROGS = lit Tokens Tokens_gscan words words_posn words_monad tiny haskell tiger
ALEX_OPTS = --template=.. -g
# ALEX_OPTS = --template=..
%.alex.hs : %.x
$(ALEX) $(ALEX_OPTS) $< -o $@
%.happy.hs : %.y
$(HAPPY) $(HAPPY_OPTS) $< -o $@
%.o : %.hs
$(HC) $(HC_OPTS) -c -o $@ $<
CLEAN_FILES += *.info *.hi *.o *.bin *.exe
all : $(addsuffix $(exeext),$(PROGS))
tiny$(exeext) : tiny.happy.hs Tokens_posn.alex.hs
$(HC) $(HC_OPTS) -o $@ $^
lit$(exeext) : lit.alex.hs
$(HC) $(HC_OPTS) -o $@ $^
Tokens$(exeext) : Tokens.alex.hs
$(HC) $(HC_OPTS) -o $@ $^
Tokens_gscan$(exeext) : Tokens_gscan.alex.hs
$(HC) $(HC_OPTS) -o $@ $^
words$(exeext) : words.alex.hs
$(HC) $(HC_OPTS) -o $@ $^
words_posn$(exeext) : words_posn.alex.hs
$(HC) $(HC_OPTS) -o $@ $^
words_monad$(exeext) : words_monad.alex.hs
$(HC) $(HC_OPTS) -o $@ $^
haskell$(exeext) : haskell.alex.hs
$(HC) $(HC_OPTS) -o $@ $^
tiger$(exeext) : tiger.alex.hs
$(HC) $(HC_OPTS) -main-is TigerLexer -o $@ $^
.PHONY: clean
clean:
rm -f *.o *.hi $(addsuffix $(exeext),$(PROGS)) \
*.alex.hs *.happy.hs
alex-3.2.5/examples/Tokens.x 0000755 0000000 0000000 00000001155 07346545000 014122 0 ustar 00 0000000 0000000 {
module Main where
}
%wrapper "basic"
$digit = 0-9 -- digits
$alpha = [a-zA-Z] -- alphabetic characters
tokens :-
$white+ { \s -> White }
"--".* { \s -> Comment }
let { \s -> Let }
in { \s -> In }
$digit+ { \s -> Int (read s) }
[\=\+\-\*\/\(\)] { \s -> Sym (head s) }
$alpha [$alpha $digit \_ \']* { \s -> Var s }
{
-- Each right-hand side has type :: String -> Token
-- The token type:
data Token =
White |
Comment |
Let |
In |
Sym Char |
Var String |
Int Int |
Err
deriving (Eq,Show)
main = do
s <- getContents
print (alexScanTokens s)
}
alex-3.2.5/examples/Tokens_gscan.x 0000755 0000000 0000000 00000001432 07346545000 015273 0 ustar 00 0000000 0000000 {
module Main (main) where
}
%wrapper "gscan"
$digit = 0-9 -- digits
$alpha = [a-zA-Z] -- alphabetic characters
tokens :-
$white+ ;
"--".* ;
let { tok (\p s -> Let p) }
in { tok (\p s -> In p) }
$digit+ { tok (\p s -> Int p (read s)) }
[\=\+\-\*\/\(\)] { tok (\p s -> Sym p (head s)) }
$alpha [$alpha $digit \_ \']* { tok (\p s -> Var p s) }
{
-- Some action helpers:
tok f p c str len cont (sc,state) = f p (take len str) : cont (sc,state)
-- The token type:
data Token =
Let AlexPosn |
In AlexPosn |
Sym AlexPosn Char |
Var AlexPosn String |
Int AlexPosn Int |
Err AlexPosn
deriving (Eq,Show)
main = do
s <- getContents
print (alexGScan stop undefined s)
where
stop p c "" (sc,s) = []
stop p c _ (sc,s) = error "lexical error"
}
alex-3.2.5/examples/Tokens_posn.x 0000755 0000000 0000000 00000001462 07346545000 015162 0 ustar 00 0000000 0000000 {
module Tokens_posn (Token(..), AlexPosn(..), alexScanTokens, token_posn) where
}
%wrapper "posn"
$digit = 0-9 -- digits
$alpha = [a-zA-Z] -- alphabetic characters
tokens :-
$white+ ;
"--".* ;
let { tok (\p s -> Let p) }
in { tok (\p s -> In p) }
$digit+ { tok (\p s -> Int p (read s)) }
[\=\+\-\*\/\(\)] { tok (\p s -> Sym p (head s)) }
$alpha [$alpha $digit \_ \']* { tok (\p s -> Var p s) }
{
-- Each right-hand side has type :: AlexPosn -> String -> Token
-- Some action helpers:
tok f p s = f p s
-- The token type:
data Token =
Let AlexPosn |
In AlexPosn |
Sym AlexPosn Char |
Var AlexPosn String |
Int AlexPosn Int
deriving (Eq,Show)
token_posn (Let p) = p
token_posn (In p) = p
token_posn (Sym p _) = p
token_posn (Var p _) = p
token_posn (Int p _) = p
}
alex-3.2.5/examples/examples.x 0000755 0000000 0000000 00000001261 07346545000 014473 0 ustar 00 0000000 0000000 "example_rexps":-
::= $ | a+ -- = a*, zero or more as
::= aa* -- = a+, one or more as
::= $ | a -- = a?, zero or one as
::= a{3} -- = aaa, three as
::= a{3,5} -- = a{3}a?a?
::= a{3,} -- = a{3}a*
"example_sets":-
::= a-z -- little letters
::= ~a-z -- anything but little letters
::= [a-zA-Z0-9] -- letters and digits
::= `!@@#$' -- the symbols !, @@, # and $
::= [`!#@@$'^'^n] -- the above symbols with ' and newline
::= ^p#^' -- any graphic character except '
::= ^127 -- ASCII DEL
alex-3.2.5/examples/haskell.x 0000755 0000000 0000000 00000010440 07346545000 014277 0 ustar 00 0000000 0000000 --
-- Lexical syntax for Haskell 98.
--
-- (c) Simon Marlow 2003, with the caveat that much of this is
-- translated directly from the syntax in the Haskell 98 report.
--
-- This isn't a complete Haskell 98 lexer - it doesn't handle layout
-- for one thing. However, it could be adapted with a small
-- amount of effort.
--
{
module Main (main) where
import Data.Char (chr, ord)
}
%wrapper "monad"
$whitechar = [ \t\n\r\f\v]
$special = [\(\)\,\;\[\]\`\{\}]
$ascdigit = 0-9
$unidigit = [] -- TODO
$digit = [$ascdigit $unidigit]
$ascsymbol = [\!\#\$\%\&\*\+\.\/\<\=\>\?\@\\\^\|\-\~]
$unisymbol = [] -- TODO
$symbol = [$ascsymbol $unisymbol] # [$special \_\:\"\']
$large = [A-Z \xc0-\xd6 \xd8-\xde]
$small = [a-z \xdf-\xf6 \xf8-\xff \_]
$alpha = [$small $large]
$graphic = [$small $large $symbol $digit $special \:\"\']
$octit = 0-7
$hexit = [0-9 A-F a-f]
$idchar = [$alpha $digit \']
$symchar = [$symbol \:]
$nl = [\n\r]
@reservedid =
as|case|class|data|default|deriving|do|else|hiding|if|
import|in|infix|infixl|infixr|instance|let|module|newtype|
of|qualified|then|type|where
@reservedop =
".." | ":" | "::" | "=" | \\ | "|" | "<-" | "->" | "@" | "~" | "=>"
@varid = $small $idchar*
@conid = $large $idchar*
@varsym = $symbol $symchar*
@consym = \: $symchar*
@decimal = $digit+
@octal = $octit+
@hexadecimal = $hexit+
@exponent = [eE] [\-\+] @decimal
$cntrl = [$large \@\[\\\]\^\_]
@ascii = \^ $cntrl | NUL | SOH | STX | ETX | EOT | ENQ | ACK
| BEL | BS | HT | LF | VT | FF | CR | SO | SI | DLE
| DC1 | DC2 | DC3 | DC4 | NAK | SYN | ETB | CAN | EM
| SUB | ESC | FS | GS | RS | US | SP | DEL
$charesc = [abfnrtv\\\"\'\&]
@escape = \\ ($charesc | @ascii | @decimal | o @octal | x @hexadecimal)
@gap = \\ $whitechar+ \\
@string = $graphic # [\"\\] | " " | @escape | @gap
haskell :-
<0> $white+ { skip }
<0> "--"\-*[^$symbol].* { skip }
"{-" { nested_comment }
<0> $special { mkL LSpecial }
<0> @reservedid { mkL LReservedId }
<0> @conid \. @varid { mkL LQVarId }
<0> @conid \. @conid { mkL LQConId }
<0> @varid { mkL LVarId }
<0> @conid { mkL LConId }
<0> @reservedop { mkL LReservedOp }
<0> @conid \. @varsym { mkL LQVarSym }
<0> @conid \. @consym { mkL LQConSym }
<0> @varsym { mkL LVarSym }
<0> @consym { mkL LConSym }
<0> @decimal
| 0[oO] @octal
| 0[xX] @hexadecimal { mkL LInteger }
<0> @decimal \. @decimal @exponent?
| @decimal @exponent { mkL LFloat }
<0> \' ($graphic # [\'\\] | " " | @escape) \'
{ mkL LChar }
<0> \" @string* \" { mkL LString }
{
data Lexeme = L AlexPosn LexemeClass String
data LexemeClass
= LInteger
| LFloat
| LChar
| LString
| LSpecial
| LReservedId
| LReservedOp
| LVarId
| LQVarId
| LConId
| LQConId
| LVarSym
| LQVarSym
| LConSym
| LQConSym
| LEOF
deriving Eq
mkL :: LexemeClass -> AlexInput -> Int -> Alex Lexeme
mkL c (p,_,_,str) len = return (L p c (take len str))
nested_comment :: AlexInput -> Int -> Alex Lexeme
nested_comment _ _ = do
input <- alexGetInput
go 1 input
where go 0 input = do alexSetInput input; alexMonadScan
go n input = do
case alexGetByte input of
Nothing -> err input
Just (c,input) -> do
case chr (fromIntegral c) of
'-' -> do
let temp = input
case alexGetByte input of
Nothing -> err input
Just (125,input) -> go (n-1) input
Just (45, input) -> go n temp
Just (c,input) -> go n input
'\123' -> do
case alexGetByte input of
Nothing -> err input
Just (c,input) | c == fromIntegral (ord '-') -> go (n+1) input
Just (c,input) -> go n input
c -> go n input
err input = do alexSetInput input; lexError "error in nested comment"
lexError s = do
(p,c,_,input) <- alexGetInput
alexError (showPosn p ++ ": " ++ s ++
(if (not (null input))
then " before " ++ show (head input)
else " at end of file"))
scanner str = runAlex str $ do
let loop i = do tok@(L _ cl _) <- alexMonadScan;
if cl == LEOF
then return i
else do loop $! (i+1)
loop 0
alexEOF = return (L undefined LEOF "")
showPosn (AlexPn _ line col) = show line ++ ':': show col
main = do
s <- getContents
print (scanner s)
}
alex-3.2.5/examples/lit.x 0000755 0000000 0000000 00000001713 07346545000 013447 0 ustar 00 0000000 0000000 {
{-# LANGUAGE NPlusKPatterns #-}
module Main (main) where
}
%wrapper "gscan"
$space = $white # \n
@blank = \n $space*
@scrap = \n \> .*
@comment = \n ( [^ \> $white] | $space+ ~$white ) .*
lit :-
@blank @scrap+ { scrap }
@blank @comment* { comment }
{
scrap _ _ inp len cont st = strip len inp
where
strip 0 _ = cont st
strip (n+1) (c:rst) =
if c=='\n'
then '\n':strip_nl n rst
else c:strip n rst
strip_nl (n+1) ('>':rst) = ' ':strip n rst
strip_nl n rst = strip n rst
comment _ _ inp len cont st = strip len inp
where
strip 0 _ = cont st
strip (n+1) (c:rst) = if c=='\n' then c:strip n rst else strip n rst
main:: IO ()
main = interact literate
literate:: String -> String
literate inp = drop 2 (alexGScan stop_act () ('\n':'\n':inp))
stop_act p _ "" st = []
stop_act p _ _ _ = error (msg ++ loc p ++ "\n")
msg = "literate preprocessing error at "
loc (AlexPn _ l c) = "line " ++ show(l-2) ++ ", column " ++ show c
}
alex-3.2.5/examples/pp.x 0000755 0000000 0000000 00000001313 07346545000 013272 0 ustar 00 0000000 0000000 %{
import System
import Char
import Alex
%}
"pp_lx"/"pp_acts":-
{ ^s = ^w#^n } -- spaces and tabs, etc.
{ ^f = [A-Za-z0-9`~%-_.,/'] } -- file name character
::= ^#include^s+^"^f+^"^s*^n
::= .*^n
%{
inc p c inp len cont st = pp fn >> cont st
where
fn = (takeWhile ('"'/=) . tail . dropWhile isSpace . drop 8) inp
txt p c inp len cont st = putStr (take len inp) >> cont st
main:: IO ()
main = getArgs >>= \args ->
case args of
[fn] -> pp fn
_ -> error "usage: pp file\n"
pp:: String -> IO ()
pp fn = readFile fn >>= \cts -> gscan pp_scan () cts
pp_scan:: GScan () (IO ())
pp_scan = load_gscan (pp_acts,stop_act) pp_lx
where
stop_act _ _ _ _ = return ()
%}
alex-3.2.5/examples/state.x 0000755 0000000 0000000 00000001071 07346545000 013774 0 ustar 00 0000000 0000000 {
module Main (main) where
}
%wrapper "gscan"
state :-
$white+ { skip }
\{ [^\}]* \} { code }
[A-Za-z]+ { ide }
{
code _ _ inp len cont (sc,frags) = cont (sc,frag:frags)
where
frag = take (len-4) (drop 2 inp)
ide _ _ inp len cont st = Ide (take len inp):cont st
skip _ _ inp len cont st = cont st
data Token = Ide String | Eof String | Err deriving Show
stop_act _ _ "" (_,frags) = [Eof (unlines(reverse frags))]
stop_act _ _ _ _ = [Err]
tokens:: String -> [Token]
tokens inp = alexGScan stop_act [] inp
main:: IO ()
main = interact (show.tokens)
}
alex-3.2.5/examples/tiny.y 0000755 0000000 0000000 00000002573 07346545000 013650 0 ustar 00 0000000 0000000 -- An example demonstrating how to connect a Happy parser to an Alex lexer.
{
import Tokens_posn
}
%name calc
%tokentype { Token }
%token let { Let _ }
in { In _ }
int { Int _ $$ }
var { Var _ $$ }
'=' { Sym _ '=' }
'+' { Sym _ '+' }
'-' { Sym _ '-' }
'*' { Sym _ '*' }
'/' { Sym _ '/' }
'(' { Sym _ '(' }
')' { Sym _ ')' }
%%
Exp :: { Exp }
Exp : let var '=' Exp in Exp { LetE $2 $4 $6 }
| Exp1 { $1 }
Exp1 : Exp1 '+' Term { PlusE $1 $3 }
| Exp1 '-' Term { MinusE $1 $3 }
| Term { $1 }
Term : Term '*' Factor { TimesE $1 $3 }
| Term '/' Factor { DivE $1 $3 }
| Factor { $1 }
Factor : '-' Atom { NegE $2 }
| Atom { $1 }
Atom : int { IntE $1 }
| var { VarE $1 }
| '(' Exp ')' { $2 }
{
data Exp =
LetE String Exp Exp |
PlusE Exp Exp |
MinusE Exp Exp |
TimesE Exp Exp |
DivE Exp Exp |
NegE Exp |
IntE Int |
VarE String
deriving Show
main:: IO ()
main = interact (show.runCalc)
runCalc :: String -> Exp
runCalc = calc . alexScanTokens
happyError :: [Token] -> a
happyError tks = error ("Parse error at " ++ lcn ++ "\n")
where
lcn = case tks of
[] -> "end of file"
tk:_ -> "line " ++ show l ++ ", column " ++ show c
where
AlexPn _ l c = token_posn tk
}
alex-3.2.5/examples/words.x 0000755 0000000 0000000 00000000351 07346545000 014012 0 ustar 00 0000000 0000000 -- Performance test; run with input /usr/dict/words, for example
{
module Main (main) where
}
%wrapper "basic"
words :-
$white+ ;
[A-Za-z0-9\'\-]+ { \s -> () }
{
main = do
s <- getContents
print (length (alexScanTokens s))
}
alex-3.2.5/examples/words_monad.x 0000755 0000000 0000000 00000000762 07346545000 015176 0 ustar 00 0000000 0000000 -- Performance test; run with input /usr/dict/words, for example
{
module Main (main) where
}
%wrapper "monad"
words :-
$white+ { skip }
[A-Za-z0-9\'\-]+ { word }
{
word (_,_,_,input) len = return (take len input)
scanner str = runAlex str $ do
let loop i = do tok <- alexMonadScan
if tok == "stopped." || tok == "error."
then return i
else do let i' = i+1 in i' `seq` loop i'
loop 0
alexEOF = return "stopped."
main = do
s <- getContents
print (scanner s)
}
alex-3.2.5/examples/words_posn.x 0000755 0000000 0000000 00000000352 07346545000 015052 0 ustar 00 0000000 0000000 -- Performance test; run with input /usr/dict/words, for example
{
module Main (main) where
}
%wrapper "posn"
words :-
$white+ ;
[A-Za-z0-9\'\-]+ { \p s -> () }
{
main = do
s <- getContents
print (length (alexScanTokens s))
}
alex-3.2.5/src/ 0000755 0000000 0000000 00000000000 07346545000 011432 5 ustar 00 0000000 0000000 alex-3.2.5/src/AbsSyn.hs 0000644 0000000 0000000 00000033360 07346545000 013172 0 ustar 00 0000000 0000000 -- -----------------------------------------------------------------------------
--
-- AbsSyn.hs, part of Alex
--
-- (c) Chris Dornan 1995-2000, Simon Marlow 2003
--
-- This module provides a concrete representation for regular expressions and
-- scanners. Scanners are used for tokenising files in preparation for parsing.
--
-- ----------------------------------------------------------------------------}
module AbsSyn (
Code, Directive(..), Scheme(..),
wrapperName,
Scanner(..),
RECtx(..),
RExp(..),
DFA(..), State(..), SNum, StartCode, Accept(..),
RightContext(..), showRCtx, strtype,
encodeStartCodes, extractActions,
Target(..),
UsesPreds(..), usesPreds,
StrType(..)
) where
import CharSet ( CharSet, Encoding )
import Map ( Map )
import qualified Map hiding ( Map )
import Data.IntMap (IntMap)
import Sort ( nub' )
import Util ( str, nl )
import Data.Maybe ( fromJust )
infixl 4 :|
infixl 5 :%%
-- -----------------------------------------------------------------------------
-- Abstract Syntax for Alex scripts
type Code = String
data Directive
= WrapperDirective String -- use this wrapper
| EncodingDirective Encoding -- use this encoding
| ActionType String -- Type signature of actions,
-- with optional typeclasses
| TypeClass String
| TokenType String
deriving Show
data StrType = Str | Lazy | Strict
instance Show StrType where
show Str = "String"
show Lazy = "ByteString.ByteString"
show Strict = "ByteString.ByteString"
data Scheme
= Default { defaultTypeInfo :: Maybe (Maybe String, String) }
| GScan { gscanTypeInfo :: Maybe (Maybe String, String) }
| Basic { basicStrType :: StrType,
basicTypeInfo :: Maybe (Maybe String, String) }
| Posn { posnByteString :: Bool,
posnTypeInfo :: Maybe (Maybe String, String) }
| Monad { monadByteString :: Bool, monadUserState :: Bool,
monadTypeInfo :: Maybe (Maybe String, String) }
strtype :: Bool -> String
strtype True = "ByteString.ByteString"
strtype False = "String"
wrapperName :: Scheme -> Maybe String
wrapperName Default {} = Nothing
wrapperName GScan {} = Just "gscan"
wrapperName Basic { basicStrType = Str } = Just "basic"
wrapperName Basic { basicStrType = Lazy } = Just "basic-bytestring"
wrapperName Basic { basicStrType = Strict } = Just "strict-bytestring"
wrapperName Posn { posnByteString = False } = Just "posn"
wrapperName Posn { posnByteString = True } = Just "posn-bytestring"
wrapperName Monad { monadByteString = False,
monadUserState = False } = Just "monad"
wrapperName Monad { monadByteString = True,
monadUserState = False } = Just "monad-bytestring"
wrapperName Monad { monadByteString = False,
monadUserState = True } = Just "monadUserState"
wrapperName Monad { monadByteString = True,
monadUserState = True } = Just "monadUserState-bytestring"
-- TODO: update this comment
--
-- A `Scanner' consists of an association list associating token names with
-- regular expressions with context. The context may include a list of start
-- codes, some leading context to test the character immediately preceding the
-- token and trailing context to test the residual input after the token.
--
-- The start codes consist of the names and numbers of the start codes;
-- initially the names only will be generated by the parser, the numbers being
-- allocated at a later stage. Start codes become meaningful when scanners are
-- converted to DFAs; see the DFA section of the Scan module for details.
data Scanner = Scanner { scannerName :: String,
scannerTokens :: [RECtx] }
deriving Show
data RECtx = RECtx { reCtxStartCodes :: [(String,StartCode)],
reCtxPreCtx :: Maybe CharSet,
reCtxRE :: RExp,
reCtxPostCtx :: RightContext RExp,
reCtxCode :: Maybe Code
}
data RightContext r
= NoRightContext
| RightContextRExp r
| RightContextCode Code
deriving (Eq,Ord)
instance Show RECtx where
showsPrec _ (RECtx scs _ r rctx code) =
showStarts scs . shows r . showRCtx rctx . showMaybeCode code
showMaybeCode :: Maybe String -> String -> String
showMaybeCode Nothing = id
showMaybeCode (Just code) = showCode code
showCode :: String -> String -> String
showCode code = showString " { " . showString code . showString " }"
showStarts :: [(String, StartCode)] -> String -> String
showStarts [] = id
showStarts scs = shows scs
showRCtx :: Show r => RightContext r -> String -> String
showRCtx NoRightContext = id
showRCtx (RightContextRExp r) = ('\\':) . shows r
showRCtx (RightContextCode code) = showString "\\ " . showCode code
-- -----------------------------------------------------------------------------
-- DFAs
data DFA s a = DFA
{ dfa_start_states :: [s],
dfa_states :: Map s (State s a)
}
data State s a = State { state_acc :: [Accept a],
state_out :: IntMap s -- 0..255 only
}
type SNum = Int
data Accept a
= Acc { accPrio :: Int,
accAction :: Maybe a,
accLeftCtx :: Maybe CharSet, -- cannot be converted to byteset at this point.
accRightCtx :: RightContext SNum
}
deriving (Eq,Ord)
-- debug stuff
instance Show (Accept a) where
showsPrec _ (Acc p _act _lctx _rctx) = shows p --TODO
type StartCode = Int
-- -----------------------------------------------------------------------------
-- Predicates / contexts
-- we can generate somewhat faster code in the case that
-- the lexer doesn't use predicates
data UsesPreds = UsesPreds | DoesntUsePreds
usesPreds :: DFA s a -> UsesPreds
usesPreds dfa
| any acceptHasCtx [ acc | st <- Map.elems (dfa_states dfa)
, acc <- state_acc st ]
= UsesPreds
| otherwise
= DoesntUsePreds
where
acceptHasCtx Acc { accLeftCtx = Nothing
, accRightCtx = NoRightContext } = False
acceptHasCtx _ = True
-- -----------------------------------------------------------------------------
-- Regular expressions
-- `RExp' provides an abstract syntax for regular expressions. `Eps' will
-- match empty strings; `Ch p' matches strings containinng a single character
-- `c' if `p c' is true; `re1 :%% re2' matches a string if `re1' matches one of
-- its prefixes and `re2' matches the rest; `re1 :| re2' matches a string if
-- `re1' or `re2' matches it; `Star re', `Plus re' and `Ques re' can be
-- expressed in terms of the other operators. See the definitions of `ARexp'
-- for a formal definition of the semantics of these operators.
data RExp
= Eps
| Ch CharSet
| RExp :%% RExp
| RExp :| RExp
| Star RExp
| Plus RExp
| Ques RExp
instance Show RExp where
showsPrec _ Eps = showString "()"
showsPrec _ (Ch _) = showString "[..]"
showsPrec _ (l :%% r) = shows l . shows r
showsPrec _ (l :| r) = shows l . ('|':) . shows r
showsPrec _ (Star r) = shows r . ('*':)
showsPrec _ (Plus r) = shows r . ('+':)
showsPrec _ (Ques r) = shows r . ('?':)
{------------------------------------------------------------------------------
Abstract Regular Expression
------------------------------------------------------------------------------}
-- This section contains demonstrations; it is not part of Alex.
{-
-- This function illustrates `ARexp'. It returns true if the string in its
-- argument is matched by the regular expression.
recognise:: RExp -> String -> Bool
recognise re inp = any (==len) (ap_ar (arexp re) inp)
where
len = length inp
-- `ARexp' provides an regular expressions in abstract format. Here regular
-- expressions are represented by a function that takes the string to be
-- matched and returns the sizes of all the prefixes matched by the regular
-- expression (the list may contain duplicates). Each of the `RExp' operators
-- are represented by similarly named functions over ARexp. The `ap' function
-- takes an `ARExp', a string and returns the sizes of all the prefixes
-- matching that regular expression. `arexp' converts an `RExp' to an `ARexp'.
arexp:: RExp -> ARexp
arexp Eps = eps_ar
arexp (Ch p) = ch_ar p
arexp (re :%% re') = arexp re `seq_ar` arexp re'
arexp (re :| re') = arexp re `bar_ar` arexp re'
arexp (Star re) = star_ar (arexp re)
arexp (Plus re) = plus_ar (arexp re)
arexp (Ques re) = ques_ar (arexp re)
star_ar:: ARexp -> ARexp
star_ar sc = eps_ar `bar_ar` plus_ar sc
plus_ar:: ARexp -> ARexp
plus_ar sc = sc `seq_ar` star_ar sc
ques_ar:: ARexp -> ARexp
ques_ar sc = eps_ar `bar_ar` sc
-- Hugs abstract type definition -- not for GHC.
type ARexp = String -> [Int]
-- in ap_ar, eps_ar, ch_ar, seq_ar, bar_ar
ap_ar:: ARexp -> String -> [Int]
ap_ar sc = sc
eps_ar:: ARexp
eps_ar inp = [0]
ch_ar:: (Char->Bool) -> ARexp
ch_ar p "" = []
ch_ar p (c:rst) = if p c then [1] else []
seq_ar:: ARexp -> ARexp -> ARexp
seq_ar sc sc' inp = [n+m| n<-sc inp, m<-sc' (drop n inp)]
bar_ar:: ARexp -> ARexp -> ARexp
bar_ar sc sc' inp = sc inp ++ sc' inp
-}
-- -----------------------------------------------------------------------------
-- Utils
-- Map the available start codes onto [1..]
encodeStartCodes:: Scanner -> (Scanner,[StartCode],ShowS)
encodeStartCodes scan = (scan', 0 : map snd name_code_pairs, sc_hdr)
where
scan' = scan{ scannerTokens = map mk_re_ctx (scannerTokens scan) }
mk_re_ctx (RECtx scs lc re rc code)
= RECtx (map mk_sc scs) lc re rc code
mk_sc (nm,_) = (nm, if nm=="0" then 0
else fromJust (Map.lookup nm code_map))
sc_hdr tl =
case name_code_pairs of
[] -> tl
(nm,_):rst -> "\n" ++ nm ++ foldr f t rst
where
f (nm', _) t' = "," ++ nm' ++ t'
t = " :: Int\n" ++ foldr fmt_sc tl name_code_pairs
where
fmt_sc (nm,sc) t = nm ++ " = " ++ show sc ++ "\n" ++ t
code_map = Map.fromList name_code_pairs
name_code_pairs = zip (nub' (<=) nms) [1..]
nms = [nm | RECtx{reCtxStartCodes = scs} <- scannerTokens scan,
(nm,_) <- scs, nm /= "0"]
-- Grab the code fragments for the token actions, and replace them
-- with function names of the form alex_action_$n$. We do this
-- because the actual action fragments might be duplicated in the
-- generated file.
extractActions :: Scheme -> Scanner -> (Scanner,ShowS)
extractActions scheme scanner = (scanner{scannerTokens = new_tokens}, decl_str)
where
(new_tokens, decls) = unzip (zipWith f (scannerTokens scanner) act_names)
f r@RECtx{ reCtxCode = Just code } name
= (r{reCtxCode = Just name}, Just (mkDecl name code))
f r@RECtx{ reCtxCode = Nothing } _
= (r{reCtxCode = Nothing}, Nothing)
gscanActionType res =
str "AlexPosn -> Char -> String -> Int -> ((Int, state) -> "
. str res . str ") -> (Int, state) -> " . str res
mkDecl fun code = case scheme of
Default { defaultTypeInfo = Just (Nothing, actionty) } ->
str fun . str " :: " . str actionty . str "\n"
. str fun . str " = " . str code . nl
Default { defaultTypeInfo = Just (Just tyclasses, actionty) } ->
str fun . str " :: (" . str tyclasses . str ") => " .
str actionty . str "\n" .
str fun . str " = " . str code . nl
GScan { gscanTypeInfo = Just (Nothing, tokenty) } ->
str fun . str " :: " . gscanActionType tokenty . str "\n"
. str fun . str " = " . str code . nl
GScan { gscanTypeInfo = Just (Just tyclasses, tokenty) } ->
str fun . str " :: (" . str tyclasses . str ") => " .
gscanActionType tokenty . str "\n" .
str fun . str " = " . str code . nl
Basic { basicStrType = strty, basicTypeInfo = Just (Nothing, tokenty) } ->
str fun . str " :: " . str (show strty) . str " -> "
. str tokenty . str "\n"
. str fun . str " = " . str code . nl
Basic { basicStrType = strty,
basicTypeInfo = Just (Just tyclasses, tokenty) } ->
str fun . str " :: (" . str tyclasses . str ") => " .
str (show strty) . str " -> " . str tokenty . str "\n" .
str fun . str " = " . str code . nl
Posn { posnByteString = isByteString,
posnTypeInfo = Just (Nothing, tokenty) } ->
str fun . str " :: AlexPosn -> " . str (strtype isByteString) . str " -> "
. str tokenty . str "\n"
. str fun . str " = " . str code . nl
Posn { posnByteString = isByteString,
posnTypeInfo = Just (Just tyclasses, tokenty) } ->
str fun . str " :: (" . str tyclasses . str ") => AlexPosn -> " .
str (strtype isByteString) . str " -> " . str tokenty . str "\n" .
str fun . str " = " . str code . nl
Monad { monadByteString = isByteString,
monadTypeInfo = Just (Nothing, tokenty) } ->
let
actintty = if isByteString then "Int64" else "Int"
in
str fun . str " :: AlexInput -> " . str actintty . str " -> Alex ("
. str tokenty . str ")\n"
. str fun . str " = " . str code . nl
Monad { monadByteString = isByteString,
monadTypeInfo = Just (Just tyclasses, tokenty) } ->
let
actintty = if isByteString then "Int64" else "Int"
in
str fun . str " :: (" . str tyclasses . str ") => "
. str " AlexInput -> " . str actintty
. str " -> Alex (" . str tokenty . str ")\n"
. str fun . str " = " . str code . nl
_ -> str fun . str " = " . str code . nl
act_names = map (\n -> "alex_action_" ++ show (n::Int)) [0..]
decl_str = foldr (.) id [ decl | Just decl <- decls ]
-- -----------------------------------------------------------------------------
-- Code generation targets
data Target = GhcTarget | HaskellTarget
alex-3.2.5/src/CharSet.hs
-- -----------------------------------------------------------------------------
--
-- CharSet.hs, part of Alex
--
-- (c) Chris Dornan 1995-2000, Simon Marlow 2003
--
-- An abstract CharSet type for Alex. To begin with we'll use Alex's
-- original definition of sets as functions, then later will
-- transition to something that will work better with Unicode.
--
-- ----------------------------------------------------------------------------}
module CharSet (
setSingleton,
Encoding(..),
Byte,
ByteSet,
byteSetSingleton,
byteRanges,
byteSetRange,
CharSet, -- abstract
emptyCharSet,
charSetSingleton,
charSet,
charSetMinus,
charSetComplement,
charSetRange,
charSetUnion,
charSetQuote,
setUnions,
byteSetToArray,
byteSetElems,
byteSetElem
) where
import Data.Array
import Data.Ranged
import Data.Word
import Data.Maybe (catMaybes)
import Data.Char (chr,ord)
import UTF8
type Byte = Word8
-- Implementation as functions
type CharSet = RSet Char
type ByteSet = RSet Byte
-- type Utf8Set = RSet [Byte]
type Utf8Range = Span [Byte]
data Encoding = Latin1 | UTF8
deriving (Eq, Show)
emptyCharSet :: CharSet
emptyCharSet = rSetEmpty
byteSetElem :: ByteSet -> Byte -> Bool
byteSetElem = rSetHas
charSetSingleton :: Char -> CharSet
charSetSingleton = rSingleton
setSingleton :: DiscreteOrdered a => a -> RSet a
setSingleton = rSingleton
charSet :: [Char] -> CharSet
charSet = setUnions . fmap charSetSingleton
charSetMinus :: CharSet -> CharSet -> CharSet
charSetMinus = rSetDifference
charSetUnion :: CharSet -> CharSet -> CharSet
charSetUnion = rSetUnion
setUnions :: DiscreteOrdered a => [RSet a] -> RSet a
setUnions = foldr rSetUnion rSetEmpty
charSetComplement :: CharSet -> CharSet
charSetComplement = rSetNegation
charSetRange :: Char -> Char -> CharSet
charSetRange c1 c2 = makeRangedSet [Range (BoundaryBelow c1) (BoundaryAbove c2)]
byteSetToArray :: ByteSet -> Array Byte Bool
byteSetToArray set = array (fst (head ass), fst (last ass)) ass
where ass = [(c,rSetHas set c) | c <- [0..0xff]]
byteSetElems :: ByteSet -> [Byte]
byteSetElems set = [c | c <- [0 .. 0xff], rSetHas set c]
charToRanges :: Encoding -> CharSet -> [Utf8Range]
charToRanges Latin1 =
map (fmap ((: []).fromIntegral.ord)) -- Span [Byte]
. catMaybes
. fmap (charRangeToCharSpan False)
. rSetRanges
charToRanges UTF8 =
concat -- Span [Byte]
. fmap toUtfRange -- [Span [Byte]]
. fmap (fmap UTF8.encode) -- Span [Byte]
. catMaybes
. fmap (charRangeToCharSpan True)
. rSetRanges
-- | Turn a range of characters expressed as a pair of UTF-8 byte sequences
-- into a set of ranges in which each resulting range lies between byte
-- sequences of the same length.
toUtfRange :: Span [Byte] -> [Span [Byte]]
toUtfRange (Span x y) = fix x y
fix :: [Byte] -> [Byte] -> [Span [Byte]]
fix x y
| length x == length y = [Span x y]
| length x == 1 = Span x [0x7F] : fix [0xC2,0x80] y
| length x == 2 = Span x [0xDF,0xBF] : fix [0xE0,0x80,0x80] y
| length x == 3 = Span x [0xEF,0xBF,0xBF] : fix [0xF0,0x80,0x80,0x80] y
| otherwise = error "fix: incorrect input given"
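The byte sequences `fix` splits at are the largest 1-, 2- and 3-byte UTF-8 encodings. As a standalone illustration (not Alex code), the encoding length changes at exactly U+0080, U+0800 and U+10000:

```haskell
-- Sketch of the UTF-8 length boundaries that motivate the split points
-- in `fix`: [0x7F], [0xDF,0xBF] and [0xEF,0xBF,0xBF] are the highest
-- 1-, 2- and 3-byte encodings respectively.
utf8Len :: Char -> Int
utf8Len c
  | o <= 0x7F   = 1
  | o <= 0x7FF  = 2
  | o <= 0xFFFF = 3
  | otherwise   = 4
  where o = fromEnum c
```

So a span whose endpoints encode to different lengths is cut at these boundaries until both endpoints have the same length.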
byteRangeToBytePair :: Span [Byte] -> ([Byte],[Byte])
byteRangeToBytePair (Span x y) = (x,y)
data Span a = Span a a -- lower bound inclusive, higher bound exclusive
-- (SDM: upper bound inclusive, surely?)
instance Functor Span where
fmap f (Span x y) = Span (f x) (f y)
charRangeToCharSpan :: Bool -> Range Char -> Maybe (Span Char)
charRangeToCharSpan _ (Range BoundaryAboveAll _) = Nothing
charRangeToCharSpan _ (Range (BoundaryAbove c) _) | c == maxBound = Nothing
charRangeToCharSpan _ (Range _ BoundaryBelowAll) = Nothing
charRangeToCharSpan _ (Range _ (BoundaryBelow c)) | c == minBound = Nothing
charRangeToCharSpan uni (Range x y) = Just (Span (l x) (h y))
where l b = case b of
BoundaryBelowAll -> '\0'
BoundaryBelow a -> a
BoundaryAbove a -> succ a
BoundaryAboveAll -> error "panic: charRangeToCharSpan"
h b = case b of
BoundaryBelowAll -> error "panic: charRangeToCharSpan"
BoundaryBelow a -> pred a
BoundaryAbove a -> a
BoundaryAboveAll | uni -> chr 0x10ffff
| otherwise -> chr 0xff
byteRanges :: Encoding -> CharSet -> [([Byte],[Byte])]
byteRanges enc = fmap byteRangeToBytePair . charToRanges enc
byteSetRange :: Byte -> Byte -> ByteSet
byteSetRange c1 c2 = makeRangedSet [Range (BoundaryBelow c1) (BoundaryAbove c2)]
byteSetSingleton :: Byte -> ByteSet
byteSetSingleton = rSingleton
-- TODO: More efficient generated code!
charSetQuote :: CharSet -> String
charSetQuote s = "(\\c -> " ++ foldr (\x y -> x ++ " || " ++ y) "False" (map quoteRange (rSetRanges s)) ++ ")"
where quoteRange (Range l h) = quoteL l ++ " && " ++ quoteH h
quoteL (BoundaryAbove a) = "c > " ++ show a
quoteL (BoundaryBelow a) = "c >= " ++ show a
quoteL (BoundaryAboveAll) = "False"
quoteL (BoundaryBelowAll) = "True"
quoteH (BoundaryAbove a) = "c <= " ++ show a
quoteH (BoundaryBelow a) = "c < " ++ show a
quoteH (BoundaryAboveAll) = "True"
quoteH (BoundaryBelowAll) = "False"
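To illustrate the kind of predicate text `charSetQuote` produces, here is a simplified standalone sketch that handles only closed ranges (the real function also handles open and infinite boundaries via `quoteL`/`quoteH`):

```haskell
-- Simplified sketch: render a list of closed ranges as the source text
-- of a Haskell predicate, OR-ing the range tests onto a trailing
-- "False" in the same style as charSetQuote's foldr.
quoteRanges :: [(Char, Char)] -> String
quoteRanges rs =
  "(\\c -> "
    ++ concat [ "c >= " ++ show l ++ " && c <= " ++ show h ++ " || "
              | (l, h) <- rs ]
    ++ "False)"
```

For example, `quoteRanges [('a','z')]` yields `(\c -> c >= 'a' && c <= 'z' || False)`.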
alex-3.2.5/src/DFA.hs
-- -----------------------------------------------------------------------------
--
-- DFA.hs, part of Alex
--
-- (c) Chris Dornan 1995-2000, Simon Marlow 2003
--
-- This module generates a DFA from a scanner by first converting it
-- to an NFA and then converting the NFA with the subset construction.
--
-- See the chapter on `Finite Automata and Lexical Analysis' in the
-- dragon book for an excellent overview of the algorithms in this
-- module.
--
-- ----------------------------------------------------------------------------}
module DFA(scanner2dfa) where
import AbsSyn
import qualified Map
import qualified Data.IntMap as IntMap
import NFA
import Sort ( msort, nub' )
import CharSet
import Data.Array ( (!) )
import Data.Maybe ( fromJust )
{- Defined in the Scan Module
-- (This section should logically belong to the DFA module but it has been
-- placed here to make this module self-contained.)
--
-- `DFA' provides an alternative to `Scanner' (described in the RExp module);
-- it can be used directly to scan text efficiently. Additionally it has an
-- extra place holder for holding action functions for generating
-- application-specific tokens. When this place holder is not being used, the
-- unit type will be used.
--
-- Each state in the automaton consist of a list of `Accept' values, descending
-- in priority, and an array mapping characters to new states. As the array
-- may only cover a sub-range of the characters, a default state number is
-- given in the third field. By convention, all transitions to the -1 state
-- represent invalid transitions.
--
-- A list of accept states is provided for as the original specification may
-- have been ambiguous, in which case the highest priority token should be
-- taken (the one appearing earliest in the specification); this can not be
-- calculated when the DFA is generated in all cases as some of the tokens may
-- be associated with leading or trailing context or start codes.
--
-- `scan_token' (see above) can deal with unconditional accept states more
-- efficiently than those associated with context; to save it testing each time
-- whether the list of accept states contains an unconditional state, the flag
-- in the first field of `St' is set to true whenever the list contains an
-- unconditional state.
--
-- The `Accept' structure contains the priority of the token being accepted
-- (lower numbers => higher priorities), the name of the token, a place holder
-- that can be used for storing the `action' function for constructing the
-- token from the input text and thge scanner's state, a list of start codes
-- (listing the start codes that the scanner must be in for the token to be
-- accepted; empty => no restriction), the leading and trailing context (both
-- `Nothing' if there is none).
--
-- The leading context consists simply of a character predicate that will
-- return true if the last character read is acceptable. The trailing context
-- consists of an alternative starting state within the DFA; if this `sub-dfa'
-- turns up any accepting state when applied to the residual input then the
-- trailing context is acceptable (see `scan_token' above).
type DFA a = Array SNum (State a)
type SNum = Int
data State a = St Bool [Accept a] SNum (Array Char SNum)
data Accept a = Acc Int String a [StartCode] (MB(Char->Bool)) (MB SNum)
type StartCode = Int
-}
-- Scanners are converted to DFAs by converting them to NFAs first. Converting
-- an NFA to a DFA works by identifying the states of the DFA with subsets of
-- the NFA. The PartDFA is used to construct the DFA; it is essentially a DFA
-- in which the states are represented directly by state sets of the NFA.
-- `nfa2pdfa' constructs the partial DFA from the NFA by searching for all the
-- transitions from a given list of state sets, initially containing the start
-- state of the partial DFA, until all possible state sets have been considered.
-- The final DFA is then constructed with `mk_dfa'.
scanner2dfa:: Encoding -> Scanner -> [StartCode] -> DFA SNum Code
scanner2dfa enc scanner scs = nfa2dfa scs (scanner2nfa enc scanner scs)
nfa2dfa:: [StartCode] -> NFA -> DFA SNum Code
nfa2dfa scs nfa = mk_int_dfa nfa (nfa2pdfa nfa pdfa (dfa_start_states pdfa))
where
pdfa = new_pdfa n_starts nfa
n_starts = length scs -- number of start states
-- `nfa2pdfa' works by taking the next outstanding state set to be considered
-- and ignoring it if the state set is already in the partial DFA, otherwise
-- generating all possible transitions from it, adding the new state to the
-- partial DFA and continuing the closure with the extra states. Note the way
-- it incorporates the trailing context references into the search (by
-- including `rctx_ss' in the search).
nfa2pdfa:: NFA -> DFA StateSet Code -> [StateSet] -> DFA StateSet Code
nfa2pdfa _ pdfa [] = pdfa
nfa2pdfa nfa pdfa (ss:umkd)
| ss `in_pdfa` pdfa = nfa2pdfa nfa pdfa umkd
| otherwise = nfa2pdfa nfa pdfa' umkd'
where
pdfa' = add_pdfa ss (State accs (IntMap.fromList ss_outs)) pdfa
umkd' = rctx_sss ++ map snd ss_outs ++ umkd
-- for each character, the set of states that character would take
-- us to from the current set of states in the NFA.
ss_outs :: [(Int, StateSet)]
ss_outs = [ (fromIntegral ch, mk_ss nfa ss')
| ch <- byteSetElems $ setUnions [p | (p,_) <- outs],
let ss' = [ s' | (p,s') <- outs, byteSetElem p ch ],
not (null ss')
]
rctx_sss = [ mk_ss nfa [s]
| Acc _ _ _ (RightContextRExp s) <- accs ]
outs :: [(ByteSet,SNum)]
outs = [ out | s <- ss, out <- nst_outs (nfa!s) ]
accs = sort_accs [acc| s<-ss, acc<-nst_accs (nfa!s)]
-- `sort_accs' sorts a list of accept values into descending order of priority,
-- eliminating any elements that follow an unconditional accept value.
sort_accs:: [Accept a] -> [Accept a]
sort_accs accs = foldr chk [] (msort le accs)
where
chk acc@(Acc _ _ Nothing NoRightContext) _ = [acc]
chk acc rst = acc:rst
le (Acc{accPrio = n}) (Acc{accPrio=n'}) = n<=n'
{------------------------------------------------------------------------------
State Sets and Partial DFAs
------------------------------------------------------------------------------}
-- A `PartDFA' is a partially constructed DFA in which the states are
-- represented by sets of states of the original NFA. It is represented by a
-- triple consisting of the start state of the partial DFA, the NFA from which
-- it is derived and a map from state sets to states of the partial DFA. The
-- state set for a given list of NFA states is calculated by taking the epsilon
-- closure of all the states, sorting the result with duplicates eliminated.
type StateSet = [SNum]
new_pdfa:: Int -> NFA -> DFA StateSet a
new_pdfa starts nfa
= DFA { dfa_start_states = start_ss,
dfa_states = Map.empty
}
where
start_ss = [ msort (<=) (nst_cl(nfa!n)) | n <- [0..(starts-1)]]
-- starts is the number of start states
-- constructs the epsilon-closure of a set of NFA states
mk_ss:: NFA -> [SNum] -> StateSet
mk_ss nfa l = nub' (<=) [s'| s<-l, s'<-nst_cl(nfa!s)]
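`mk_ss` relies on per-state epsilon-closures precomputed in the NFA (the `nst_cl` field). As a standalone illustration, the closure of a seed set can also be computed as a fixpoint over an epsilon-edge function (the `eps` argument here is hypothetical, not an Alex API):

```haskell
import Data.List (sort)

-- Fixpoint epsilon-closure: keep expanding states not yet seen until
-- no new state is reachable via epsilon edges.
closureOf :: (Int -> [Int]) -> [Int] -> [Int]
closureOf eps = go []
  where
    go seen []          = sort seen
    go seen (s:ss)
      | s `elem` seen   = go seen ss
      | otherwise       = go (s:seen) (eps s ++ ss)
```

Precomputing `nst_cl` per state, as Alex does, avoids redoing this traversal for every state-set query.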
add_pdfa:: StateSet -> State StateSet a -> DFA StateSet a -> DFA StateSet a
add_pdfa ss pst (DFA st mp) = DFA st (Map.insert ss pst mp)
in_pdfa:: StateSet -> DFA StateSet a -> Bool
in_pdfa ss (DFA _ mp) = ss `Map.member` mp
-- Construct a DFA with numbered states, from a DFA whose states are
-- sets of states from the original NFA.
mk_int_dfa:: NFA -> DFA StateSet a -> DFA SNum a
mk_int_dfa nfa (DFA start_states mp)
= DFA [0 .. length start_states-1]
(Map.fromList [ (lookup' st, cnv pds) | (st, pds) <- Map.toAscList mp ])
where
mp' = Map.fromList (zip (start_states ++
(map fst . Map.toAscList) (foldr Map.delete mp start_states)) [0..])
lookup' = fromJust . flip Map.lookup mp'
cnv :: State StateSet a -> State SNum a
cnv (State accs as) = State accs' as'
where
as' = IntMap.mapWithKey (\_ch s -> lookup' s) as
accs' = map cnv_acc accs
cnv_acc (Acc p a lctx rctx) = Acc p a lctx rctx'
where rctx' =
case rctx of
RightContextRExp s ->
RightContextRExp (lookup' (mk_ss nfa [s]))
other -> other
{-
-- `mk_st' constructs a state node from the list of accept values and a list of
-- transitions. The transitions list all the valid transitions out of the
-- node; all invalid transitions should be represented in the array by state
-- -1. `mk_st' has to work out whether the accept states contain an
-- unconditional entry, in which case the first field of `St' should be true,
-- and which default state to use in constructing the array (the array may span
-- a sub-range of the character set, the state number given the third argument
-- of `St' being taken as the default if an input character lies outside the
-- range). The default values is chosen to minimise the bounds of the array
-- and so there are two candidates: the value that 0 maps to (in which case
-- some initial segment of the array may be omitted) or the value that 255 maps
-- to (in which case a final segment of the array may be omitted), hence the
-- calculation of `(df,bds)'.
--
-- Note that empty arrays are avoided as they can cause severe problems for
-- some popular Haskell compilers.
mk_st:: [Accept Code] -> [(Char,Int)] -> State Code
mk_st accs as =
if null as
then St accs (-1) (listArray ('0','0') [-1])
else St accs df (listArray bds [arr!c| c<-range bds])
where
bds = if sz==0 then ('0','0') else bds0
(sz,df,bds0) | sz1 < sz2 = (sz1,df1,bds1)
| otherwise = (sz2,df2,bds2)
(sz1,df1,bds1) = mk_bds(arr!chr 0)
(sz2,df2,bds2) = mk_bds(arr!chr 255)
mk_bds df = (t-b, df, (chr b, chr (255-t)))
where
b = length (takeWhile id [arr!c==df| c<-['\0'..'\xff']])
t = length (takeWhile id [arr!c==df| c<-['\xff','\xfe'..'\0']])
arr = listArray ('\0','\xff') (take 256 (repeat (-1))) // as
-}
alex-3.2.5/src/DFAMin.hs
{-# OPTIONS_GHC -fno-warn-name-shadowing #-}
{-# LANGUAGE PatternGuards #-}
module DFAMin (minimizeDFA) where
import AbsSyn
import Data.Map (Map)
import qualified Data.Map as Map
import Data.IntSet (IntSet)
import qualified Data.IntSet as IS
import Data.IntMap (IntMap)
import qualified Data.IntMap as IM
import Data.List as List
-- Hopcroft's Algorithm for DFA minimization (cut/pasted from Wikipedia):
-- P := {{all accepting states}, {all nonaccepting states}};
-- Q := {{all accepting states}};
-- while (Q is not empty) do
-- choose and remove a set A from Q
-- for each c in ∑ do
-- let X be the set of states for which a transition on c leads to a state in A
-- for each set Y in P for which X ∩ Y is nonempty do
-- replace Y in P by the two sets X ∩ Y and Y \ X
-- if Y is in Q
-- replace Y in Q by the same two sets
-- else
-- add the smaller of the two sets to Q
-- end;
-- end;
-- end;
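The inner refinement step of the pseudocode above (replacing Y by X ∩ Y and Y \ X) can be sketched standalone with `Data.IntSet`:

```haskell
import qualified Data.IntSet as IS

-- One Hopcroft refinement step: a partition block y is split by x
-- (the states that reach the current set A on some symbol) only when
-- both the intersection and the difference are non-empty; otherwise
-- the block is left alone.
splitBlock :: IS.IntSet -> IS.IntSet -> Maybe (IS.IntSet, IS.IntSet)
splitBlock x y
  | IS.null i || IS.null d = Nothing
  | otherwise              = Just (i, d)
  where
    i = IS.intersection x y
    d = IS.difference   y x
```

This mirrors the `i`/`d` computation in `go2` of `groupEquivStates`.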
minimizeDFA :: Ord a => DFA Int a -> DFA Int a
minimizeDFA dfa@ DFA { dfa_start_states = starts,
dfa_states = statemap
}
= DFA { dfa_start_states = starts,
dfa_states = Map.fromList states }
where
equiv_classes = groupEquivStates dfa
numbered_states = number (length starts) equiv_classes
-- assign each state in the minimized DFA a number, making
-- sure that we assign the numbers [0..] to the start states.
number _ [] = []
number n (ss:sss) =
case filter (`IS.member` ss) starts of
[] -> (n,ss) : number (n+1) sss
starts' -> zip starts' (repeat ss) ++ number n sss
-- if one of the states of the minimized DFA corresponds
-- to multiple starts states, we just have to duplicate
-- that state.
states = [
let old_states = map (lookup statemap) (IS.toList equiv)
accs = map fix_acc (state_acc (head old_states))
-- accepts should all be the same
out = IM.fromList [ (b, get_new old)
| State _ out <- old_states,
(b,old) <- IM.toList out ]
in (n, State accs out)
| (n, equiv) <- numbered_states
]
fix_acc acc = acc { accRightCtx = fix_rctxt (accRightCtx acc) }
fix_rctxt (RightContextRExp s) = RightContextRExp (get_new s)
fix_rctxt other = other
lookup m k = Map.findWithDefault (error "minimizeDFA") k m
get_new = lookup old_to_new
old_to_new :: Map Int Int
old_to_new = Map.fromList [ (s,n) | (n,ss) <- numbered_states,
s <- IS.toList ss ]
groupEquivStates :: (Ord a) => DFA Int a -> [IntSet]
groupEquivStates DFA { dfa_states = statemap }
= go init_p init_q
where
(accepting, nonaccepting) = Map.partition acc statemap
where acc (State as _) = not (List.null as)
nonaccepting_states = IS.fromList (Map.keys nonaccepting)
-- group the accepting states into equivalence classes
accept_map = {-# SCC "accept_map" #-}
foldl' (\m (n,s) -> Map.insertWith (++) (state_acc s) [n] m)
Map.empty
(Map.toList accepting)
-- accept_groups :: Ord s => [Set s]
accept_groups = map IS.fromList (Map.elems accept_map)
init_p = nonaccepting_states : accept_groups
init_q = accept_groups
-- map token T to
-- a map from state S to the list of states that transition to
-- S on token T
-- This is a cache of the information needed to compute x below
bigmap :: IntMap (IntMap [SNum])
bigmap = IM.fromListWith (IM.unionWith (++))
[ (i, IM.singleton to [from])
| (from, state) <- Map.toList statemap,
(i,to) <- IM.toList (state_out state) ]
-- incoming I A = the set of states that transition to a state in
-- A on token I.
incoming :: Int -> IntSet -> IntSet
incoming i a = IS.fromList (concat ss)
where
map1 = IM.findWithDefault IM.empty i bigmap
ss = [ IM.findWithDefault [] s map1
| s <- IS.toList a ]
-- The outer loop: recurse on each set in Q
go p [] = p
go p (a:q) = go1 0 p q
where
-- recurse on each token (0..255)
go1 256 p q = go p q
go1 i p q = go1 (i+1) p' q'
where
(p',q') = go2 p [] q
x = incoming i a
-- recurse on each set in P
go2 [] p' q = (p',q)
go2 (y:p) p' q
| IS.null i || IS.null d = go2 p (y:p') q
| otherwise = go2 p (i:d:p') q1
where
i = IS.intersection x y
d = IS.difference y x
q1 = replaceyin q
where
replaceyin [] =
if IS.size i < IS.size d then [i] else [d]
replaceyin (z:zs)
| z == y = i : d : zs
| otherwise = z : replaceyin zs
alex-3.2.5/src/DFS.hs
{------------------------------------------------------------------------------
DFS
This module is a portable version of the ghc-specific `DFS.g.hs', which is
itself a straightforward encoding of the Launchbury/King paper on linear graph
algorithms. This module uses balanced binary trees instead of mutable arrays
to implement the depth-first search so the complexity of the algorithms is
n.log(n) instead of linear.
The vertices of the graphs manipulated by these modules are labelled with the
integers from 0 to n-1 where n is the number of vertices in the graph.
The module's principal products are `mk_graph' for constructing a graph from an
edge list, `t_close' for taking the transitive closure of a graph and `scc'
for generating a list of strongly connected components; the components are
listed in dependency order and each component takes the form of a `dfs tree'
(see Launchbury and King). Thus if each edge (fid,fid') encodes the fact that
function `fid' references function `fid'' in a program then `scc' performs a
dependency analysis.
Chris Dornan, 23-Jun-94, 2-Jul-96, 29-Aug-96, 29-Sep-97
------------------------------------------------------------------------------}
module DFS where
import Set ( Set )
import qualified Set hiding ( Set )
import Data.Array ( (!), accumArray, listArray )
-- The result of a depth-first search of a graph is a list of trees,
-- `GForrest'. `postorder' provides a post-order traversal of a forest.
type GForrest = [GTree]
data GTree = GNode Int GForrest
postorder:: GForrest -> [Int]
postorder ts = po ts []
where
po ts' l = foldr po_tree l ts'
po_tree (GNode a ts') l = po ts' (a:l)
list_tree:: GTree -> [Int]
list_tree t = l_t t []
where
l_t (GNode x ts) l = foldr l_t (x:l) ts
-- Graphs are represented by a pair of an integer, giving the number of nodes
-- in the graph, and function mapping each vertex (0..n-1, n=size of graph) to
-- its neighbouring nodes. `mk_graph' takes a size and an edge list and
-- constructs a graph.
type Graph = (Int,Int->[Int])
type Edge = (Int,Int)
mk_graph:: Int -> [Edge] -> Graph
mk_graph sz es = (sz,\v->ar!v)
where
ar = accumArray (flip (:)) [] (0,sz-1) [(v,v')| (v,v')<-es]
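A standalone usage sketch of this representation (an adjacency function backed by an accumulated array), assuming nothing from the rest of the module:

```haskell
import Data.Array (accumArray, (!))

type G = (Int, Int -> [Int])

-- Build a graph from a size and an edge list, in the style of
-- mk_graph above: each vertex maps to the list of its successors.
mkG :: Int -> [(Int, Int)] -> G
mkG sz es = (sz, (ar !))
  where ar = accumArray (flip (:)) [] (0, sz - 1) es

-- Recover the edge list by enumerating every vertex's successors.
edgesOfG :: G -> [(Int, Int)]
edgesOfG (sz, f) = [ (v, v') | v <- [0 .. sz - 1], v' <- f v ]
```

Note that `accumArray (flip (:))` prepends, so each adjacency list comes out in reverse insertion order; the set of edges is unaffected.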
vertices:: Graph -> [Int]
vertices (sz,_) = [0..sz-1]
out:: Graph -> Int -> [Int]
out (_,f) = f
edges:: Graph -> [Edge]
edges g = [(v,v')| v<-vertices g, v'<-out g v]
rev_edges:: Graph -> [Edge]
rev_edges g = [(v',v)| v<-vertices g, v'<-out g v]
reverse_graph:: Graph -> Graph
reverse_graph g@(sz,_) = mk_graph sz (rev_edges g)
-- `t_close' takes the transitive closure of a graph; `scc' returns the strongly
-- connected components of the graph and `top_sort' topologically sorts the
-- graph. Note that the array is given one more element in order to avoid
-- problems with empty arrays.
t_close:: Graph -> Graph
t_close g@(sz,_) = (sz,\v->ar!v)
where
ar = listArray (0,sz) ([postorder(dff' [v] g)| v<-vertices g]++[und])
und = error "t_close"
scc:: Graph -> GForrest
scc g = dff' (reverse (top_sort (reverse_graph g))) g
top_sort:: Graph -> [Int]
top_sort = postorder . dff
-- `dff' computes the depth-first forest. It works by unrolling the
-- potentially infinite tree from each of the vertices with `generate_g' and
-- then pruning out the duplicates.
dff:: Graph -> GForrest
dff g = dff' (vertices g) g
dff':: [Int] -> Graph -> GForrest
dff' vs (_bs, f) = prune (map (generate_g f) vs)
generate_g:: (Int->[Int]) -> Int -> GTree
generate_g f v = GNode v (map (generate_g f) (f v))
prune:: GForrest -> GForrest
prune ts = snd(chop(empty_int,ts))
where
empty_int:: Set Int
empty_int = Set.empty
chop:: (Set Int,GForrest) -> (Set Int,GForrest)
chop p@(_, []) = p
chop (vstd,GNode v ts:us) =
if v `Set.member` vstd
then chop (vstd,us)
else let vstd1 = Set.insert v vstd
(vstd2,ts') = chop (vstd1,ts)
(vstd3,us') = chop (vstd2,us)
in
(vstd3,GNode v ts' : us')
{-- Some simple test functions
test:: Graph Char
test = mk_graph (char_bds ('a','h')) (mk_pairs "eefggfgegdhfhged")
where
mk_pairs [] = []
mk_pairs (a:b:l) = (a,b):mk_pairs l
-}
alex-3.2.5/src/Data/Ranged.hs
module Data.Ranged (
module Data.Ranged.Boundaries,
module Data.Ranged.Ranges,
module Data.Ranged.RangedSet
) where
import Data.Ranged.Boundaries
import Data.Ranged.Ranges
import Data.Ranged.RangedSet
alex-3.2.5/src/Data/Ranged/Boundaries.hs
-----------------------------------------------------------------------------
-- |
-- Module : Data.Ranged.Boundaries
-- Copyright : (c) Paul Johnson 2006
-- License : BSD-style
-- Maintainer : paul@cogito.org.uk
-- Stability : experimental
-- Portability : portable
--
-----------------------------------------------------------------------------
module Data.Ranged.Boundaries (
DiscreteOrdered (..),
enumAdjacent,
boundedAdjacent,
boundedBelow,
Boundary (..),
above,
(/>/)
) where
import Data.Ratio
import Data.Word
infix 4 />/
{- |
Distinguish between dense and sparse ordered types. A dense type is
one in which any two values @v1 < v2@ have a third value @v3@ such that
@v1 < v3 < v2@.
In theory the floating types are dense, although in practice they can only have
finitely many values. This class treats them as dense.
Tuples up to 4 members are declared as instances. Larger tuples may be added
if necessary.
Most values of sparse types have an @adjacentBelow@, such that, for all x:
> case adjacentBelow x of
> Just x1 -> adjacent x1 x
> Nothing -> True
The exception is for bounded types when @x == lowerBound@. For dense types
@adjacentBelow@ always returns 'Nothing'.
This approach was suggested by Ben Rudiak-Gould on comp.lang.functional.
-}
class Ord a => DiscreteOrdered a where
-- | Two values @x@ and @y@ are adjacent if @x < y@ and there does not
-- exist a third value between them. Always @False@ for dense types.
adjacent :: a -> a -> Bool
-- | The value immediately below the argument, if it can be determined.
adjacentBelow :: a -> Maybe a
-- Implementation note: the precise rules about unbounded enumerated vs
-- bounded enumerated types are difficult to express using Haskell 98, so
-- the prelude types are listed individually here.
instance DiscreteOrdered Bool where
adjacent = boundedAdjacent
adjacentBelow = boundedBelow
instance DiscreteOrdered Ordering where
adjacent = boundedAdjacent
adjacentBelow = boundedBelow
instance DiscreteOrdered Char where
adjacent = boundedAdjacent
adjacentBelow = boundedBelow
instance DiscreteOrdered Int where
adjacent = boundedAdjacent
adjacentBelow = boundedBelow
instance DiscreteOrdered Integer where
adjacent = enumAdjacent
adjacentBelow = Just . pred
instance DiscreteOrdered Double where
adjacent _ _ = False
adjacentBelow = const Nothing
instance DiscreteOrdered Float where
adjacent _ _ = False
adjacentBelow = const Nothing
instance (Integral a) => DiscreteOrdered (Ratio a) where
adjacent _ _ = False
adjacentBelow = const Nothing
instance Ord a => DiscreteOrdered [a] where
adjacent _ _ = False
adjacentBelow = const Nothing
instance (Ord a, DiscreteOrdered b) => DiscreteOrdered (a, b)
where
adjacent (x1, x2) (y1, y2) = (x1 == y1) && adjacent x2 y2
adjacentBelow (x1, x2) = do -- Maybe monad
x2' <- adjacentBelow x2
return (x1, x2')
instance (Ord a, Ord b, DiscreteOrdered c) => DiscreteOrdered (a, b, c)
where
adjacent (x1, x2, x3) (y1, y2, y3) =
(x1 == y1) && (x2 == y2) && adjacent x3 y3
adjacentBelow (x1, x2, x3) = do -- Maybe monad
x3' <- adjacentBelow x3
return (x1, x2, x3')
instance (Ord a, Ord b, Ord c, DiscreteOrdered d) =>
DiscreteOrdered (a, b, c, d)
where
adjacent (x1, x2, x3, x4) (y1, y2, y3, y4) =
(x1 == y1) && (x2 == y2) && (x3 == y3) && adjacent x4 y4
adjacentBelow (x1, x2, x3, x4) = do -- Maybe monad
x4' <- adjacentBelow x4
return (x1, x2, x3, x4')
instance DiscreteOrdered Word8 where
adjacent x y = x + 1 == y
adjacentBelow 0 = Nothing
adjacentBelow x = Just (x-1)
-- | Check adjacency for sparse enumerated types (i.e. where there
-- is no value between @x@ and @succ x@).
enumAdjacent :: (Ord a, Enum a) => a -> a -> Bool
enumAdjacent x y = (succ x == y)
-- | Check adjacency, allowing for case where x = maxBound. Use as the
-- definition of "adjacent" for bounded enumerated types such as Int and Char.
boundedAdjacent :: (Ord a, Enum a) => a -> a -> Bool
boundedAdjacent x y = if x < y then succ x == y else False
-- | The usual implementation of 'adjacentBelow' for bounded enumerated types.
boundedBelow :: (Eq a, Enum a, Bounded a) => a -> Maybe a
boundedBelow x = if x == minBound then Nothing else Just $ pred x
{- |
A Boundary is a division of an ordered type into values above
and below the boundary. No value can sit on a boundary.
Known bug: for Bounded types
* @BoundaryAbove maxBound < BoundaryAboveAll@
* @BoundaryBelow minBound > BoundaryBelowAll@
This is incorrect because there are no possible values in
between the left and right sides of these inequalities.
-}
data Boundary a =
-- | The argument is the highest value below the boundary.
BoundaryAbove a |
-- | The argument is the lowest value above the boundary.
BoundaryBelow a |
-- | The boundary above all values.
BoundaryAboveAll |
-- | The boundary below all values.
BoundaryBelowAll
deriving (Show)
-- | True if the value is above the boundary, false otherwise.
above :: Ord v => Boundary v -> v -> Bool
above (BoundaryAbove b) v = v > b
above (BoundaryBelow b) v = v >= b
above BoundaryAboveAll _ = False
above BoundaryBelowAll _ = True
-- | Same as 'above', but with the arguments reversed for more intuitive infix
-- usage.
(/>/) :: Ord v => v -> Boundary v -> Bool
(/>/) = flip above
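The ordering instances below identify, for example, `BoundaryAbove 'a'` with `BoundaryBelow 'b'`, because no `Char` lies strictly between them. A standalone mirror of that adjacency test (the `B` type here is hypothetical, not the library's):

```haskell
data B a = Above a | Below a deriving Show

-- Above x and Below y denote the same cut of a sparse type exactly
-- when y is the successor of x: no value can sit between them.
sameCut :: (Ord a, Enum a) => B a -> B a -> Bool
sameCut (Above x) (Below y) = x < y && succ x == y
sameCut (Below y) (Above x) = x < y && succ x == y
sameCut _         _         = False
```

The real `Ord (Boundary a)` instance generalises this check via the `adjacent` method of `DiscreteOrdered`.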
instance (DiscreteOrdered a) => Eq (Boundary a) where
b1 == b2 = compare b1 b2 == EQ
instance (DiscreteOrdered a) => Ord (Boundary a) where
-- Comparison algorithm based on brute force and ignorance:
-- enumerate all combinations.
compare boundary1 boundary2 =
case boundary1 of
BoundaryAbove b1 ->
case boundary2 of
BoundaryAbove b2 -> compare b1 b2
BoundaryBelow b2 ->
if b1 < b2
then
if adjacent b1 b2 then EQ else LT
else GT
BoundaryAboveAll -> LT
BoundaryBelowAll -> GT
BoundaryBelow b1 ->
case boundary2 of
BoundaryAbove b2 ->
if b1 > b2
then
if adjacent b2 b1 then EQ else GT
else LT
BoundaryBelow b2 -> compare b1 b2
BoundaryAboveAll -> LT
BoundaryBelowAll -> GT
BoundaryAboveAll ->
case boundary2 of
BoundaryAboveAll -> EQ
_ -> GT
BoundaryBelowAll ->
case boundary2 of
BoundaryBelowAll -> EQ
_ -> LT
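For example, with the `DiscreteOrdered Int` instance (where `adjacent x y` holds when `x + 1 == y`), `BoundaryAbove 3` and `BoundaryBelow 4` describe the same division of the integers, so they compare equal. A minimal sketch, assuming the module is importable from the alex source tree:

```haskell
import Data.Ranged.Boundaries

main :: IO ()
main = do
  -- The boundary just above 3 is the boundary just below 4 for Int.
  print (compare (BoundaryAbove (3 :: Int)) (BoundaryBelow 4))  -- EQ
  -- The boundary below 3 sits to the left of the boundary above 3.
  print (compare (BoundaryBelow (3 :: Int)) (BoundaryAbove 3))  -- LT
```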
alex-3.2.5/src/Data/Ranged/RangedSet.hs
module Data.Ranged.RangedSet (
-- ** Ranged Set Type
RSet,
rSetRanges,
-- ** Ranged Set construction functions and their preconditions
makeRangedSet,
unsafeRangedSet,
validRangeList,
normaliseRangeList,
rSingleton,
rSetUnfold,
-- ** Predicates
rSetIsEmpty,
rSetIsFull,
(-?-), rSetHas,
(-<=-), rSetIsSubset,
(-<-), rSetIsSubsetStrict,
-- ** Set Operations
(-\/-), rSetUnion,
(-/\-), rSetIntersection,
(-!-), rSetDifference,
rSetNegation,
-- ** Useful Sets
rSetEmpty,
rSetFull,
) where
import Data.Ranged.Boundaries
import Data.Ranged.Ranges
#if __GLASGOW_HASKELL__ >= 800
import Data.Semigroup
#elif __GLASGOW_HASKELL__ < 710
import Data.Monoid
#endif
import Data.List
infixl 7 -/\-
infixl 6 -\/-, -!-
infixl 5 -<=-, -<-, -?-
-- | An RSet (for Ranged Set) is a list of ranges. The ranges must be sorted
-- and not overlap.
newtype DiscreteOrdered v => RSet v = RSet {rSetRanges :: [Range v]}
deriving (Eq, Show, Ord)
#if __GLASGOW_HASKELL__ >= 800
instance DiscreteOrdered a => Semigroup (RSet a) where
(<>) = rSetUnion
instance DiscreteOrdered a => Monoid (RSet a) where
mappend = (<>)
mempty = rSetEmpty
#else
instance DiscreteOrdered a => Monoid (RSet a) where
mappend = rSetUnion
mempty = rSetEmpty
#endif
-- | Determine if the ranges in the list are both in order and non-overlapping.
-- If so then they are suitable input for the unsafeRangedSet function.
validRangeList :: DiscreteOrdered v => [Range v] -> Bool
validRangeList [] = True
validRangeList [Range lower upper] = lower <= upper
validRangeList rs = and $ zipWith okAdjacent rs (tail rs)
where
okAdjacent (Range lower1 upper1) (Range lower2 upper2) =
lower1 <= upper1 && upper1 <= lower2 && lower2 <= upper2
-- | Rearrange and merge the ranges in the list so that they are in order and
-- non-overlapping.
normaliseRangeList :: DiscreteOrdered v => [Range v] -> [Range v]
normaliseRangeList = normalise . sort . filter (not . rangeIsEmpty)
-- Private routine: normalise a range list that is known to be already sorted.
-- This precondition is not checked.
normalise :: DiscreteOrdered v => [Range v] -> [Range v]
normalise (r1:r2:rs) =
if overlap r1 r2
then normalise $
Range (rangeLower r1)
(max (rangeUpper r1) (rangeUpper r2))
: rs
else r1 : (normalise $ r2 : rs)
where
overlap (Range _ upper1) (Range lower2 _) = upper1 >= lower2
normalise rs = rs
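As an illustration, two overlapping Int ranges collapse to a single range. A sketch assuming the Data.Ranged modules are importable from the alex source tree:

```haskell
import Data.Ranged.Boundaries
import Data.Ranged.Ranges
import Data.Ranged.RangedSet (normaliseRangeList)

main :: IO ()
main =
  -- [1,5] and [3,8] overlap, so they are merged into the single range [1,8].
  print (normaliseRangeList
           [ Range (BoundaryBelow (1 :: Int)) (BoundaryAbove 5)
           , Range (BoundaryBelow 3) (BoundaryAbove 8) ]
         == [Range (BoundaryBelow 1) (BoundaryAbove 8)])  -- True
</imports>
```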
-- | Create a new Ranged Set from a list of ranges. The list may contain
-- ranges that overlap or are not in ascending order.
makeRangedSet :: DiscreteOrdered v => [Range v] -> RSet v
makeRangedSet = RSet . normaliseRangeList
-- | Create a new Ranged Set from a list of ranges. @validRangeList ranges@
-- must return @True@. This precondition is not checked.
unsafeRangedSet :: DiscreteOrdered v => [Range v] -> RSet v
unsafeRangedSet = RSet
-- | Create a Ranged Set from a single element.
rSingleton :: DiscreteOrdered v => v -> RSet v
rSingleton v = unsafeRangedSet [singletonRange v]
-- | True if the set has no members.
rSetIsEmpty :: DiscreteOrdered v => RSet v -> Bool
rSetIsEmpty = null . rSetRanges
-- | True if the negation of the set has no members.
rSetIsFull :: DiscreteOrdered v => RSet v -> Bool
rSetIsFull = rSetIsEmpty . rSetNegation
-- | True if the value is within the ranged set. Infix precedence is left 5.
rSetHas, (-?-) :: DiscreteOrdered v => RSet v -> v -> Bool
rSetHas (RSet ls) value = rSetHas1 ls
where
rSetHas1 [] = False
rSetHas1 (r:rs)
| value />/ rangeLower r = rangeHas r value || rSetHas1 rs
| otherwise = False
(-?-) = rSetHas
-- | True if the first argument is a subset of the second argument, or is
-- equal.
--
-- Infix precedence is left 5.
rSetIsSubset, (-<=-) :: DiscreteOrdered v => RSet v -> RSet v -> Bool
rSetIsSubset rs1 rs2 = rSetIsEmpty (rs1 -!- rs2)
(-<=-) = rSetIsSubset
-- | True if the first argument is a strict subset of the second argument.
--
-- Infix precedence is left 5.
rSetIsSubsetStrict, (-<-) :: DiscreteOrdered v => RSet v -> RSet v -> Bool
rSetIsSubsetStrict rs1 rs2 =
rSetIsEmpty (rs1 -!- rs2)
&& not (rSetIsEmpty (rs2 -!- rs1))
(-<-) = rSetIsSubsetStrict
-- | Set union for ranged sets. Infix precedence is left 6.
rSetUnion, (-\/-) :: DiscreteOrdered v => RSet v -> RSet v -> RSet v
-- Implementation note: rSetUnion merges the two lists into a single
-- sorted list and then calls normalise to combine overlapping ranges.
rSetUnion (RSet ls1) (RSet ls2) = RSet $ normalise $ merge ls1 ls2
where
merge ms1 [] = ms1
merge [] ms2 = ms2
merge ms1@(h1:t1) ms2@(h2:t2) =
if h1 < h2
then h1 : merge t1 ms2
else h2 : merge ms1 t2
(-\/-) = rSetUnion
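Union also merges ranges that merely touch: for Int, the singletons {1} and {2} are adjacent, so their union is the single range covering both. An illustrative sketch under the same import assumptions as above:

```haskell
import Data.Ranged.Boundaries
import Data.Ranged.Ranges
import Data.Ranged.RangedSet

main :: IO ()
main =
  -- Adjacent singletons coalesce into one range under union.
  print (rSingleton (1 :: Int) -\/- rSingleton 2
         == makeRangedSet [Range (BoundaryBelow 1) (BoundaryAbove 2)])  -- True
```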
-- | Set intersection for ranged sets. Infix precedence is left 7.
rSetIntersection, (-/\-) :: DiscreteOrdered v => RSet v -> RSet v -> RSet v
rSetIntersection (RSet ls1) (RSet ls2) =
RSet $ filter (not . rangeIsEmpty) $ merge ls1 ls2
where
merge ms1@(h1:t1) ms2@(h2:t2) =
rangeIntersection h1 h2
: if rangeUpper h1 < rangeUpper h2
then merge t1 ms2
else merge ms1 t2
merge _ _ = []
(-/\-) = rSetIntersection
-- | Set difference. Infix precedence is left 6.
rSetDifference, (-!-) :: DiscreteOrdered v => RSet v -> RSet v -> RSet v
rSetDifference rs1 rs2 = rs1 -/\- (rSetNegation rs2)
(-!-) = rSetDifference
-- | Set negation.
rSetNegation :: DiscreteOrdered a => RSet a -> RSet a
rSetNegation set = RSet $ ranges1 $ setBounds1
where
ranges1 (b1:b2:bs) = Range b1 b2 : ranges1 bs
ranges1 [BoundaryAboveAll] = []
ranges1 [b] = [Range b BoundaryAboveAll]
ranges1 _ = []
setBounds1 = case setBounds of
(BoundaryBelowAll : bs) -> bs
_ -> BoundaryBelowAll : setBounds
setBounds = bounds $ rSetRanges set
bounds (r:rs) = rangeLower r : rangeUpper r : bounds rs
bounds _ = []
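Negation works by flipping the boundary list, so negating the empty set yields the full set, and double negation is the identity. A sketch, assuming Data.Ranged.RangedSet is importable from the alex source tree:

```haskell
import Data.Ranged.RangedSet

main :: IO ()
main = do
  print (rSetNegation (rSetEmpty :: RSet Int) == rSetFull)            -- True
  print (rSetNegation (rSetNegation (rSingleton (5 :: Int)))
         == rSingleton 5)                                             -- True
```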
-- | The empty set.
rSetEmpty :: DiscreteOrdered a => RSet a
rSetEmpty = RSet []
-- | The set that contains everything.
rSetFull :: DiscreteOrdered a => RSet a
rSetFull = RSet [Range BoundaryBelowAll BoundaryAboveAll]
-- | Construct a range set.
rSetUnfold :: DiscreteOrdered a =>
Boundary a
-- ^ A first lower boundary.
-> (Boundary a -> Boundary a)
-- ^ A function from a lower boundary to an upper boundary, which must
-- return a result greater than the argument (not checked).
-> (Boundary a -> Maybe (Boundary a))
-- ^ A function from a lower boundary to @Maybe@ the successor lower
-- boundary, which must return a result greater than the argument
-- (not checked). If ranges overlap then they will be merged.
-> RSet a
rSetUnfold bound upperFunc succFunc = RSet $ normalise $ ranges1 bound
where
ranges1 b =
Range b (upperFunc b)
: case succFunc b of
Just b2 -> ranges1 b2
Nothing -> []
alex-3.2.5/src/Data/Ranged/Ranges.hs
-----------------------------------------------------------------------------
--
-- Module : Data.Ranged.Ranges
-- Copyright : (c) Paul Johnson 2006
-- License : BSD-style
-- Maintainer : paul@cogito.org.uk
-- Stability : experimental
-- Portability : portable
--
-----------------------------------------------------------------------------
-- | A range has an upper and lower boundary.
module Data.Ranged.Ranges (
-- ** Construction
Range (..),
emptyRange,
fullRange,
-- ** Predicates
rangeIsEmpty,
rangeIsFull,
rangeOverlap,
rangeEncloses,
rangeSingletonValue,
-- ** Membership
rangeHas,
rangeListHas,
-- ** Set Operations
singletonRange,
rangeIntersection,
rangeUnion,
rangeDifference,
) where
import Data.Ranged.Boundaries
-- | A Range has upper and lower boundaries.
data Range v = Range {rangeLower, rangeUpper :: Boundary v}
instance (DiscreteOrdered a) => Eq (Range a) where
r1 == r2 = (rangeIsEmpty r1 && rangeIsEmpty r2) ||
(rangeLower r1 == rangeLower r2 &&
rangeUpper r1 == rangeUpper r2)
instance (DiscreteOrdered a) => Ord (Range a) where
compare r1 r2
| r1 == r2 = EQ
| rangeIsEmpty r1 = LT
| rangeIsEmpty r2 = GT
| otherwise = compare (rangeLower r1, rangeUpper r1)
(rangeLower r2, rangeUpper r2)
instance (Show a, DiscreteOrdered a) => Show (Range a) where
show r
| rangeIsEmpty r = "Empty"
| rangeIsFull r = "All x"
| otherwise =
case rangeSingletonValue r of
Just v -> "x == " ++ show v
Nothing -> lowerBound ++ "x" ++ upperBound
where
lowerBound = case rangeLower r of
BoundaryBelowAll -> ""
BoundaryBelow v -> show v ++ " <= "
BoundaryAbove v -> show v ++ " < "
BoundaryAboveAll -> error "show Range: lower bound is BoundaryAboveAll"
upperBound = case rangeUpper r of
BoundaryBelowAll -> error "show Range: upper bound is BoundaryBelowAll"
BoundaryBelow v -> " < " ++ show v
BoundaryAbove v -> " <= " ++ show v
BoundaryAboveAll -> ""
-- | True if the value is within the range.
rangeHas :: Ord v => Range v -> v -> Bool
rangeHas (Range b1 b2) v =
(v />/ b1) && not (v />/ b2)
-- | True if the value is within one of the ranges.
rangeListHas :: Ord v =>
[Range v] -> v -> Bool
rangeListHas ls v = any (\r -> rangeHas r v) ls
-- | The empty range
emptyRange :: Range v
emptyRange = Range BoundaryAboveAll BoundaryBelowAll
-- | The full range. All values are within it.
fullRange :: Range v
fullRange = Range BoundaryBelowAll BoundaryAboveAll
-- | A range containing a single value
singletonRange :: v -> Range v
singletonRange v = Range (BoundaryBelow v) (BoundaryAbove v)
-- | If the range is a singleton, returns @Just@ the value. Otherwise returns
-- @Nothing@.
--
-- Known bug: This always returns @Nothing@ for ranges including
-- @BoundaryBelowAll@ or @BoundaryAboveAll@. For bounded types this can be
-- incorrect. For instance, the following range only contains one value:
--
-- > Range (BoundaryBelow maxBound) BoundaryAboveAll
rangeSingletonValue :: DiscreteOrdered v => Range v -> Maybe v
rangeSingletonValue (Range (BoundaryBelow v1) (BoundaryBelow v2))
| adjacent v1 v2 = Just v1
| otherwise = Nothing
rangeSingletonValue (Range (BoundaryBelow v1) (BoundaryAbove v2))
| v1 == v2 = Just v1
| otherwise = Nothing
rangeSingletonValue (Range (BoundaryAbove v1) (BoundaryBelow v2)) =
do
v2' <- adjacentBelow v2
v2'' <- adjacentBelow v2'
if v1 == v2'' then return v2' else Nothing
rangeSingletonValue (Range (BoundaryAbove v1) (BoundaryAbove v2))
| adjacent v1 v2 = Just v2
| otherwise = Nothing
rangeSingletonValue (Range _ _) = Nothing
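For instance, a range built with `singletonRange` contains exactly one value, which this function recovers, while the full range has no singleton value. An illustrative sketch, assuming Data.Ranged.Ranges is importable from the alex source tree:

```haskell
import Data.Ranged.Ranges

main :: IO ()
main = do
  print (rangeSingletonValue (singletonRange 'a'))      -- Just 'a'
  print (rangeSingletonValue fullRange :: Maybe Char)   -- Nothing
```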
-- | A range is empty unless its upper boundary is greater than its lower
-- boundary.
rangeIsEmpty :: DiscreteOrdered v => Range v -> Bool
rangeIsEmpty (Range lower upper) = upper <= lower
-- | A range is full if it contains every possible value.
rangeIsFull :: DiscreteOrdered v => Range v -> Bool
rangeIsFull = (== fullRange)
-- | Two ranges overlap if their intersection is non-empty.
rangeOverlap :: DiscreteOrdered v => Range v -> Range v -> Bool
rangeOverlap r1 r2 =
not (rangeIsEmpty r1)
&& not (rangeIsEmpty r2)
&& not (rangeUpper r1 <= rangeLower r2 || rangeUpper r2 <= rangeLower r1)
-- | The first range encloses the second if every value in the second range is
-- also within the first range. If the second range is empty then this is
-- always true.
rangeEncloses :: DiscreteOrdered v => Range v -> Range v -> Bool
rangeEncloses r1 r2 =
(rangeLower r1 <= rangeLower r2 && rangeUpper r2 <= rangeUpper r1)
|| rangeIsEmpty r2
-- | Intersection of two ranges, if any.
rangeIntersection :: DiscreteOrdered v => Range v -> Range v -> Range v
rangeIntersection r1@(Range lower1 upper1) r2@(Range lower2 upper2)
| rangeIsEmpty r1 || rangeIsEmpty r2 = emptyRange
| otherwise = Range (max lower1 lower2) (min upper1 upper2)
-- | Union of two ranges. Returns one or two results.
--
-- If there are two results then they are guaranteed to have a non-empty
-- gap in between, but may not be in ascending order.
rangeUnion :: DiscreteOrdered v => Range v -> Range v -> [Range v]
rangeUnion r1@(Range lower1 upper1) r2@(Range lower2 upper2)
| rangeIsEmpty r1 = [r2]
| rangeIsEmpty r2 = [r1]
| otherwise =
if touching then [Range lower upper] else [r1, r2]
where
touching = (max lower1 lower2) <= (min upper1 upper2)
lower = min lower1 lower2
upper = max upper1 upper2
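For example, two disjoint Int ranges do not touch, so their union stays as two ranges. A sketch under the same import assumptions:

```haskell
import Data.Ranged.Boundaries
import Data.Ranged.Ranges

main :: IO ()
main =
  -- [1,2] and [5,6] are separated by a gap, so no merging happens.
  print (length (rangeUnion
                   (Range (BoundaryBelow (1 :: Int)) (BoundaryAbove 2))
                   (Range (BoundaryBelow 5) (BoundaryAbove 6))))  -- 2
```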
-- | @range1@ minus @range2@. Returns zero, one or two results. Multiple
-- results are guaranteed to have non-empty gaps in between, but may not be in
-- ascending order.
rangeDifference :: DiscreteOrdered v => Range v -> Range v -> [Range v]
rangeDifference r1@(Range lower1 upper1) (Range lower2 upper2) =
-- There are six possibilities
-- 1: r2 completely less than r1
-- 2: r2 overlaps bottom of r1
-- 3: r2 encloses r1
-- 4: r1 encloses r2
-- 5: r2 overlaps top of r1
-- 6: r2 completely greater than r1
if intersects
then -- Cases 2,3,4,5
filter (not . rangeIsEmpty) [Range lower1 lower2, Range upper2 upper1]
else -- Cases 1, 6
[r1]
where
intersects = (max lower1 lower2) < (min upper1 upper2)
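Case 4 is the one that yields two results: removing a range from the middle of an enclosing range leaves a piece on each side. A sketch under the same import assumptions:

```haskell
import Data.Ranged.Boundaries
import Data.Ranged.Ranges

main :: IO ()
main =
  -- Removing the singleton {5} from the full range leaves "x < 5" and "5 < x".
  print (rangeDifference (fullRange :: Range Int) (singletonRange 5)
         == [ Range BoundaryBelowAll (BoundaryBelow 5)
            , Range (BoundaryAbove 5) BoundaryAboveAll ])  -- True
```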
alex-3.2.5/src/Info.hs
-- -----------------------------------------------------------------------------
--
-- Info.hs, part of Alex
--
-- (c) Simon Marlow 2003
--
-- Generate a human-readable rendition of the state machine.
--
-- -----------------------------------------------------------------------------
module Info (infoDFA) where
import AbsSyn
import qualified Map
import qualified Data.IntMap as IntMap
import Util
-- -----------------------------------------------------------------------------
-- Generate a human-readable dump of the state machine
infoDFA :: Int -> String -> DFA SNum Code -> ShowS
infoDFA _ func_nm dfa
= str "Scanner : " . str func_nm . nl
. str "States : " . shows (length dfa_list) . nl
. nl . infoDFA'
where
dfa_list = Map.toAscList (dfa_states dfa)
infoDFA' = interleave_shows nl (map infoStateN dfa_list)
infoStateN (i,s) = str "State " . shows i . nl . infoState s
infoState :: State SNum Code -> ShowS
infoState (State accs out)
= foldr (.) id (map infoAccept accs)
. infoArr out . nl
infoArr out
= char '\t' . interleave_shows (str "\n\t")
(map infoTransition (IntMap.toAscList out))
infoAccept (Acc p act lctx rctx)
= str "\tAccept" . paren (shows p) . space
. outputLCtx lctx . space
. showRCtx rctx
. (case act of
Nothing -> id
Just code -> str " { " . str code . str " }")
. nl
infoTransition (char',state)
= str (ljustify 8 (show char'))
. str " -> "
. shows state
outputLCtx Nothing
= id
outputLCtx (Just set)
= paren (show set ++) . char '^'
-- outputArr arr
-- = str "Array.array " . shows (bounds arr) . space
-- . shows (assocs arr)
alex-3.2.5/src/Main.hs
{-# LANGUAGE CPP #-}
-- -----------------------------------------------------------------------------
--
-- Main.hs, part of Alex
--
-- (c) Chris Dornan 1995-2000, Simon Marlow 2003
--
-- -----------------------------------------------------------------------------
module Main (main) where
import AbsSyn
import CharSet
import DFA
import DFAMin
import NFA
import Info
import Map ( Map )
import qualified Map hiding ( Map )
import Output
import ParseMonad ( runP )
import Parser
import Scan
import Util ( hline )
import Paths_alex ( version, getDataDir )
#if __GLASGOW_HASKELL__ < 610
import Control.Exception as Exception ( block, unblock, catch, throw )
#endif
#if __GLASGOW_HASKELL__ >= 610
import Control.Exception ( bracketOnError )
#endif
import Control.Monad ( when, liftM )
import Data.Char ( chr )
import Data.List ( isSuffixOf, nub )
import Data.Maybe ( isJust, fromJust )
import Data.Version ( showVersion )
import System.Console.GetOpt ( getOpt, usageInfo, ArgOrder(..), OptDescr(..), ArgDescr(..) )
import System.Directory ( removeFile )
import System.Environment ( getProgName, getArgs )
import System.Exit ( ExitCode(..), exitWith )
import System.IO ( stderr, Handle, IOMode(..), openFile, hClose, hPutStr, hPutStrLn )
#if __GLASGOW_HASKELL__ >= 612
import System.IO ( hGetContents, hSetEncoding, utf8 )
#endif
-- We need to force every file we open to be read in
-- as UTF8
alexReadFile :: FilePath -> IO String
#if __GLASGOW_HASKELL__ >= 612
alexReadFile file = do
h <- alexOpenFile file ReadMode
hGetContents h
#else
alexReadFile = readFile
#endif
-- We need to force every file we write to be written
-- to as UTF8
alexOpenFile :: FilePath -> IOMode -> IO Handle
#if __GLASGOW_HASKELL__ >= 612
alexOpenFile file mode = do
h <- openFile file mode
hSetEncoding h utf8
return h
#else
alexOpenFile = openFile
#endif
-- `main' decodes the command line arguments and calls `alex'.
main:: IO ()
main = do
args <- getArgs
case getOpt Permute argInfo args of
(cli,_,[]) | DumpHelp `elem` cli -> do
prog <- getProgramName
bye (usageInfo (usageHeader prog) argInfo)
(cli,_,[]) | DumpVersion `elem` cli ->
bye copyright
(cli,[file],[]) ->
runAlex cli file
(_,_,errors) -> do
prog <- getProgramName
die (concat errors ++ usageInfo (usageHeader prog) argInfo)
projectVersion :: String
projectVersion = showVersion version
copyright :: String
copyright = "Alex version " ++ projectVersion ++ ", (c) 2003 Chris Dornan and Simon Marlow\n"
usageHeader :: String -> String
usageHeader prog = "Usage: " ++ prog ++ " [OPTION...] file\n"
runAlex :: [CLIFlags] -> FilePath -> IO ()
runAlex cli file = do
basename <- case (reverse file) of
'x':'.':r -> return (reverse r)
_ -> die (file ++ ": filename must end in \'.x\'\n")
prg <- alexReadFile file
script <- parseScript file prg
alex cli file basename script
parseScript :: FilePath -> String
-> IO (Maybe (AlexPosn,Code), [Directive], Scanner, Maybe (AlexPosn,Code))
parseScript file prg =
case runP prg initialParserEnv parse of
Left (Just (AlexPn _ line col),err) ->
die (file ++ ":" ++ show line ++ ":" ++ show col
++ ": " ++ err ++ "\n")
Left (Nothing, err) ->
die (file ++ ": " ++ err ++ "\n")
Right script -> return script
alex :: [CLIFlags] -> FilePath -> FilePath
-> (Maybe (AlexPosn, Code), [Directive], Scanner, Maybe (AlexPosn, Code))
-> IO ()
alex cli file basename script = do
(put_info, finish_info) <-
case [ f | OptInfoFile f <- cli ] of
[] -> return (\_ -> return (), return ())
[Nothing] -> infoStart file (basename ++ ".info")
[Just f] -> infoStart file f
_ -> dieAlex "multiple -i/--info options"
o_file <- case [ f | OptOutputFile f <- cli ] of
[] -> return (basename ++ ".hs")
[f] -> return f
_ -> dieAlex "multiple -o/--outfile options"
tab_size <- case [ s | OptTabSize s <- cli ] of
[] -> return (8 :: Int)
[s] -> case reads s of
[(n,"")] -> return n
_ -> dieAlex "-s/--tab-size option is not a valid integer"
_ -> dieAlex "multiple -s/--tab-size options"
let target
| OptGhcTarget `elem` cli = GhcTarget
| otherwise = HaskellTarget
let encodingsCli
| OptLatin1 `elem` cli = [Latin1]
| otherwise = []
template_dir <- templateDir getDataDir cli
let (maybe_header, directives, scanner1, maybe_footer) = script
scheme <- getScheme directives
-- open the output file; remove it if we encounter an error
bracketOnError
(alexOpenFile o_file WriteMode)
(\h -> do hClose h; removeFile o_file)
$ \out_h -> do
let
wrapper_name = wrapperFile template_dir scheme
(scanner2, scs, sc_hdr) = encodeStartCodes scanner1
(scanner_final, actions) = extractActions scheme scanner2
encodingsScript = [ e | EncodingDirective e <- directives ]
encoding <- case nub (encodingsCli ++ encodingsScript) of
[] -> return UTF8 -- default
[e] -> return e
_ | null encodingsCli -> dieAlex "conflicting %encoding directives"
| otherwise -> dieAlex "--latin1 flag conflicts with %encoding directive"
hPutStr out_h (optsToInject target cli)
injectCode maybe_header file out_h
hPutStr out_h (importsToInject target cli)
-- add the wrapper, if necessary
when (isJust wrapper_name) $
do str <- alexReadFile (fromJust wrapper_name)
hPutStr out_h str
-- Inject the tab size
hPutStrLn out_h $ "alex_tab_size :: Int"
hPutStrLn out_h $ "alex_tab_size = " ++ show (tab_size :: Int)
let dfa = scanner2dfa encoding scanner_final scs
min_dfa = minimizeDFA dfa
nm = scannerName scanner_final
usespreds = usesPreds min_dfa
put_info "\nStart codes\n"
put_info (show $ scs)
put_info "\nScanner\n"
put_info (show $ scanner_final)
put_info "\nNFA\n"
put_info (show $ scanner2nfa encoding scanner_final scs)
put_info "\nDFA"
put_info (infoDFA 1 nm dfa "")
put_info "\nMinimized DFA"
put_info (infoDFA 1 nm min_dfa "")
hPutStr out_h (outputDFA target 1 nm scheme min_dfa "")
injectCode maybe_footer file out_h
hPutStr out_h (sc_hdr "")
hPutStr out_h (actions "")
-- add the template
let template_name = templateFile template_dir target usespreds cli
tmplt <- alexReadFile template_name
hPutStr out_h tmplt
hClose out_h
finish_info
getScheme :: [Directive] -> IO Scheme
getScheme directives =
do
token <- case [ ty | TokenType ty <- directives ] of
[] -> return Nothing
[res] -> return (Just res)
_ -> dieAlex "multiple %token directives"
action <- case [ ty | ActionType ty <- directives ] of
[] -> return Nothing
[res] -> return (Just res)
_ -> dieAlex "multiple %action directives"
typeclass <- case [ tyclass | TypeClass tyclass <- directives ] of
[] -> return Nothing
[res] -> return (Just res)
_ -> dieAlex "multiple %typeclass directives"
case [ f | WrapperDirective f <- directives ] of
[] ->
case (typeclass, token, action) of
(Nothing, Nothing, Nothing) ->
return Default { defaultTypeInfo = Nothing }
(Nothing, Nothing, Just actionty) ->
return Default { defaultTypeInfo = Just (Nothing, actionty) }
(Just _, Nothing, Just actionty) ->
return Default { defaultTypeInfo = Just (typeclass, actionty) }
(_, Just _, _) ->
dieAlex "%token directive only allowed with a wrapper"
(Just _, Nothing, Nothing) ->
dieAlex "%typeclass directive without %token directive"
[single]
| single == "gscan" ->
case (typeclass, token, action) of
(Nothing, Nothing, Nothing) ->
return GScan { gscanTypeInfo = Nothing }
(Nothing, Just tokenty, Nothing) ->
return GScan { gscanTypeInfo = Just (Nothing, tokenty) }
(Just _, Just tokenty, Nothing) ->
return GScan { gscanTypeInfo = Just (typeclass, tokenty) }
(_, _, Just _) ->
dieAlex "%action directive not allowed with a wrapper"
(Just _, Nothing, Nothing) ->
dieAlex "%typeclass directive without %token directive"
| single == "basic" || single == "basic-bytestring" ||
single == "strict-bytestring" ->
let
strty = case single of
"basic" -> Str
"basic-bytestring" -> Lazy
"strict-bytestring" -> Strict
_ -> error "Impossible case"
in case (typeclass, token, action) of
(Nothing, Nothing, Nothing) ->
return Basic { basicStrType = strty,
basicTypeInfo = Nothing }
(Nothing, Just tokenty, Nothing) ->
return Basic { basicStrType = strty,
basicTypeInfo = Just (Nothing, tokenty) }
(Just _, Just tokenty, Nothing) ->
return Basic { basicStrType = strty,
basicTypeInfo = Just (typeclass, tokenty) }
(_, _, Just _) ->
dieAlex "%action directive not allowed with a wrapper"
(Just _, Nothing, Nothing) ->
dieAlex "%typeclass directive without %token directive"
| single == "posn" || single == "posn-bytestring" ->
let
isByteString = single == "posn-bytestring"
in case (typeclass, token, action) of
(Nothing, Nothing, Nothing) ->
return Posn { posnByteString = isByteString,
posnTypeInfo = Nothing }
(Nothing, Just tokenty, Nothing) ->
return Posn { posnByteString = isByteString,
posnTypeInfo = Just (Nothing, tokenty) }
(Just _, Just tokenty, Nothing) ->
return Posn { posnByteString = isByteString,
posnTypeInfo = Just (typeclass, tokenty) }
(_, _, Just _) ->
dieAlex "%action directive not allowed with a wrapper"
(Just _, Nothing, Nothing) ->
dieAlex "%typeclass directive without %token directive"
| single == "monad" || single == "monad-bytestring" ||
single == "monadUserState" ||
single == "monadUserState-bytestring" ->
let
isByteString = single == "monad-bytestring" ||
single == "monadUserState-bytestring"
userState = single == "monadUserState" ||
single == "monadUserState-bytestring"
in case (typeclass, token, action) of
(Nothing, Nothing, Nothing) ->
return Monad { monadByteString = isByteString,
monadUserState = userState,
monadTypeInfo = Nothing }
(Nothing, Just tokenty, Nothing) ->
return Monad { monadByteString = isByteString,
monadUserState = userState,
monadTypeInfo = Just (Nothing, tokenty) }
(Just _, Just tokenty, Nothing) ->
return Monad { monadByteString = isByteString,
monadUserState = userState,
monadTypeInfo = Just (typeclass, tokenty) }
(_, _, Just _) ->
dieAlex "%action directive not allowed with a wrapper"
(Just _, Nothing, Nothing) ->
dieAlex "%typeclass directive without %token directive"
| otherwise -> dieAlex ("unknown wrapper type " ++ single)
_many -> dieAlex "multiple %wrapper directives"
-- inject some code, and add a {-# LINE #-} pragma at the top
injectCode :: Maybe (AlexPosn,Code) -> FilePath -> Handle -> IO ()
injectCode Nothing _ _ = return ()
injectCode (Just (AlexPn _ ln _,code)) filename hdl = do
hPutStrLn hdl ("{-# LINE " ++ show ln ++ " \"" ++ filename ++ "\" #-}")
hPutStrLn hdl code
optsToInject :: Target -> [CLIFlags] -> String
optsToInject GhcTarget _ = optNoWarnings ++ "{-# LANGUAGE CPP,MagicHash #-}\n"
optsToInject _ _ = optNoWarnings ++ "{-# LANGUAGE CPP #-}\n"
optNoWarnings :: String
optNoWarnings = "{-# OPTIONS_GHC -fno-warn-unused-binds -fno-warn-missing-signatures #-}\n"
importsToInject :: Target -> [CLIFlags] -> String
importsToInject _ cli = always_imports ++ debug_imports ++ glaexts_import
where
glaexts_import | OptGhcTarget `elem` cli = import_glaexts
| otherwise = ""
debug_imports | OptDebugParser `elem` cli = import_debug
| otherwise = ""
-- CPP is turned on for -fglasgow-exts, so we can use conditional
-- compilation. We need to #include "config.h" to get hold of
-- WORDS_BIGENDIAN (see GenericTemplate.hs).
always_imports :: String
always_imports = "#if __GLASGOW_HASKELL__ >= 603\n" ++
"#include \"ghcconfig.h\"\n" ++
"#elif defined(__GLASGOW_HASKELL__)\n" ++
"#include \"config.h\"\n" ++
"#endif\n" ++
"#if __GLASGOW_HASKELL__ >= 503\n" ++
"import Data.Array\n" ++
"#else\n" ++
"import Array\n" ++
"#endif\n"
import_glaexts :: String
import_glaexts = "#if __GLASGOW_HASKELL__ >= 503\n" ++
"import Data.Array.Base (unsafeAt)\n" ++
"import GHC.Exts\n" ++
"#else\n" ++
"import GlaExts\n" ++
"#endif\n"
import_debug :: String
import_debug = "#if __GLASGOW_HASKELL__ >= 503\n" ++
"import System.IO\n" ++
"import System.IO.Unsafe\n" ++
"import Debug.Trace\n" ++
"#else\n" ++
"import IO\n" ++
"import IOExts\n" ++
"#endif\n"
templateDir :: IO FilePath -> [CLIFlags] -> IO FilePath
templateDir def cli
= case [ d | OptTemplateDir d <- cli ] of
[] -> def
ds -> return (last ds)
templateFile :: FilePath -> Target -> UsesPreds -> [CLIFlags] -> FilePath
templateFile dir target usespreds cli
= dir ++ "/AlexTemplate" ++ maybe_ghc ++ maybe_debug ++ maybe_nopred
where
maybe_ghc = case target of
GhcTarget -> "-ghc"
_ -> ""
maybe_debug
| OptDebugParser `elem` cli = "-debug"
| otherwise = ""
maybe_nopred =
case usespreds of
DoesntUsePreds | not (null maybe_ghc)
&& null maybe_debug -> "-nopred"
_ -> ""
wrapperFile :: FilePath -> Scheme -> Maybe FilePath
wrapperFile dir scheme =
do
f <- wrapperName scheme
return (dir ++ "/AlexWrapper-" ++ f)
infoStart :: FilePath -> FilePath -> IO (String -> IO (), IO ())
infoStart x_file info_file = do
bracketOnError
(alexOpenFile info_file WriteMode)
(\h -> do hClose h; removeFile info_file)
(\h -> do infoHeader h x_file
return (hPutStr h, hClose h)
)
infoHeader :: Handle -> FilePath -> IO ()
infoHeader h file = do
-- hSetBuffering h NoBuffering
hPutStrLn h ("Info file produced by Alex version " ++ projectVersion ++
", from " ++ file)
hPutStrLn h hline
hPutStr h "\n"
initialParserEnv :: (Map String CharSet, Map String RExp)
initialParserEnv = (initSetEnv, initREEnv)
initSetEnv :: Map String CharSet
initSetEnv = Map.fromList [("white", charSet " \t\n\v\f\r"),
("printable", charSetRange (chr 32) (chr 0x10FFFF)), -- FIXME: look this up in the Unicode standard
(".", charSetComplement emptyCharSet
`charSetMinus` charSetSingleton '\n')]
initREEnv :: Map String RExp
initREEnv = Map.empty
-- -----------------------------------------------------------------------------
-- Command-line flags
data CLIFlags
= OptDebugParser
| OptGhcTarget
| OptOutputFile FilePath
| OptInfoFile (Maybe FilePath)
| OptTabSize String
| OptTemplateDir FilePath
| OptLatin1
| DumpHelp
| DumpVersion
deriving Eq
argInfo :: [OptDescr CLIFlags]
argInfo = [
Option ['o'] ["outfile"] (ReqArg OptOutputFile "FILE")
"write the output to FILE (default: file.hs)",
Option ['i'] ["info"] (OptArg OptInfoFile "FILE")
"put detailed state-machine info in FILE (or file.info)",
Option ['t'] ["template"] (ReqArg OptTemplateDir "DIR")
"look in DIR for template files",
Option ['g'] ["ghc"] (NoArg OptGhcTarget)
"use GHC extensions",
Option ['l'] ["latin1"] (NoArg OptLatin1)
"generated lexer will use the Latin-1 encoding instead of UTF-8",
Option ['s'] ["tab-size"] (ReqArg OptTabSize "NUMBER")
"set tab size to be used in the generated lexer (default: 8)",
Option ['d'] ["debug"] (NoArg OptDebugParser)
"produce a debugging scanner",
Option ['?'] ["help"] (NoArg DumpHelp)
"display this help and exit",
Option ['V','v'] ["version"] (NoArg DumpVersion) -- ToDo: -v is deprecated!
"output version information and exit"
]
-- -----------------------------------------------------------------------------
-- Utils
getProgramName :: IO String
getProgramName = liftM (`withoutSuffix` ".bin") getProgName
where str `withoutSuffix` suff
| suff `isSuffixOf` str = take (length str - length suff) str
| otherwise = str
bye :: String -> IO a
bye s = putStr s >> exitWith ExitSuccess
die :: String -> IO a
die s = hPutStr stderr s >> exitWith (ExitFailure 1)
dieAlex :: String -> IO a
dieAlex s = getProgramName >>= \prog -> die (prog ++ ": " ++ s)
#if __GLASGOW_HASKELL__ < 610
bracketOnError
:: IO a -- ^ computation to run first (\"acquire resource\")
-> (a -> IO b) -- ^ computation to run last (\"release resource\")
-> (a -> IO c) -- ^ computation to run in-between
-> IO c -- returns the value from the in-between computation
bracketOnError before after thing =
block (do
a <- before
r <- Exception.catch
(unblock (thing a))
(\e -> do { after a; throw e })
return r
)
#endif
alex-3.2.5/src/Map.hs
{-# LANGUAGE CPP #-}
module Map (
Map,
member, lookup, findWithDefault,
empty,
insert, insertWith,
delete,
union, unionWith, unions,
mapWithKey,
elems,
fromList, fromListWith,
toAscList
) where
#if __GLASGOW_HASKELL__ >= 603
import Data.Map
import Prelude ()
#else
import Data.FiniteMap
import Prelude hiding ( lookup )
type Map k a = FiniteMap k a
member :: Ord k => k -> Map k a -> Bool
member = elemFM
lookup :: Ord k => k -> Map k a -> Maybe a
lookup = flip lookupFM
findWithDefault :: Ord k => a -> k -> Map k a -> a
findWithDefault a k m = lookupWithDefaultFM m a k
empty :: Map k a
empty = emptyFM
insert :: Ord k => k -> a -> Map k a -> Map k a
insert k a m = addToFM m k a
insertWith :: Ord k => (a -> a -> a) -> k -> a -> Map k a -> Map k a
insertWith c k a m = addToFM_C c m k a
delete :: Ord k => k -> Map k a -> Map k a
delete = flip delFromFM
union :: Ord k => Map k a -> Map k a -> Map k a
union = flip plusFM
unionWith :: Ord k => (a -> a -> a) -> Map k a -> Map k a -> Map k a
unionWith c l r = plusFM_C c r l
unions :: Ord k => [Map k a] -> Map k a
unions = foldl (flip plusFM) emptyFM
mapWithKey :: (k -> a -> b) -> Map k a -> Map k b
mapWithKey = mapFM
elems :: Map k a -> [a]
elems = eltsFM
fromList :: Ord k => [(k,a)] -> Map k a
fromList = listToFM
fromListWith :: Ord k => (a -> a -> a) -> [(k,a)] -> Map k a
fromListWith c = addListToFM_C (flip c) emptyFM
toAscList :: Map k a -> [(k,a)]
toAscList = fmToList
#endif
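The FiniteMap-backed shim is written so that both branches match Data.Map's argument conventions; in particular `fromListWith` combines the new value as the first argument of the combining function, which is why it is flipped for `addListToFM_C`. A behaviour both branches should agree on (illustrative sketch, assuming this Map module is importable from the alex source tree):

```haskell
import qualified Map

main :: IO ()
main =
  -- Later duplicates are combined on the left, matching Data.Map:
  -- fromListWith (++) [(1,"a"),(1,"b")] holds "ba" at key 1.
  print (Map.lookup 1 (Map.fromListWith (++) [(1 :: Int, "a"), (1, "b")]))
```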
alex-3.2.5/src/NFA.hs
-- -----------------------------------------------------------------------------
--
-- NFA.hs, part of Alex
--
-- (c) Chris Dornan 1995-2000, Simon Marlow 2003
--
-- The `scanner2nfa' takes a `Scanner' (see the `RExp' module) and
-- generates its equivalent nondeterministic finite automaton. NFAs
-- are turned into DFAs in the DFA module.
--
-- See the chapter on `Finite Automata and Lexical Analysis' in the
-- dragon book for an excellent overview of the algorithms in this
-- module.
--
-- ----------------------------------------------------------------------------}
module NFA where
import AbsSyn
import CharSet
import DFS ( t_close, out )
import Map ( Map )
import qualified Map hiding ( Map )
import Util ( str, space )
#if __GLASGOW_HASKELL__ < 710
import Control.Applicative ( Applicative(..) )
#endif
import Control.Monad ( forM_, zipWithM, zipWithM_, when, liftM, ap )
import Data.Array ( Array, (!), array, listArray, assocs, bounds )
-- Each state of a nondeterministic automaton contains a list of `Accept'
-- values, a list of epsilon transitions (an epsilon transition represents a
-- transition to another state that can be made without reading a character)
-- and a list of transitions qualified with a character predicate (the
-- transition can only be made to the given state on input of a character
-- permitted by the predicate). Although the representation allows a list of
-- `Accept' values, in practice each state will have zero or one of them (the
-- `Maybe' type is not used because the flexibility offered by the list
-- representation is useful).
type NFA = Array SNum NState
data NState = NSt {
nst_accs :: [Accept Code],
nst_cl :: [SNum],
nst_outs :: [(ByteSet,SNum)]
}
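-- As an illustration of the state shape described above, here is a tiny
-- hand-built automaton.  This is a standalone sketch, not Alex's own code:
-- it uses plain Chars in place of ByteSet and Strings in place of
-- `Accept Code', and the names are illustrative only.

```haskell
import Data.Array (Array, listArray, (!))

type SNum' = Int

data NState' = NSt'
  { accepts'  :: [String]        -- stand-in for [Accept Code]
  , epsilons' :: [SNum']         -- epsilon transitions
  , edges'    :: [(Char, SNum')] -- stand-in for (ByteSet, SNum) edges
  }

-- NFA recognising "ab": 0 --a--> 1 --b--> 2 (accepting)
tiny :: Array SNum' NState'
tiny = listArray (0, 2)
  [ NSt' []        [] [('a', 1)]
  , NSt' []        [] [('b', 2)]
  , NSt' ["token"] [] []
  ]
```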
-- Debug stuff
instance Show NState where
showsPrec _ (NSt accs cl outs) =
str "NSt " . shows accs . space . shows cl . space .
shows [ (c, s) | (c,s) <- outs ]
{- From the Scan Module
-- The `Accept' structure contains the priority of the token being accepted
-- (lower numbers => higher priorities), the name of the token, a place holder
-- that can be used for storing the `action' function, a list of start codes
-- (listing the start codes that the scanner must be in for the token to be
-- accepted; empty => no restriction), the leading and trailing context (both
-- `Nothing' if there is none).
--
-- The leading context consists simply of a character predicate that will
-- return true if the last character read is acceptable. The trailing context
-- consists of an alternative starting state within the DFA; if this `sub-dfa'
-- turns up any accepting state when applied to the residual input then the
-- trailing context is acceptable.
-}
-- `scanner2nfa' takes a scanner (see the AbsSyn module) and converts it to an
-- NFA, using the NFA creation monad (see below).
--
-- We generate a start state for each startcode, with the same number
-- as that startcode, and epsilon transitions from this state to each
-- of the sub-NFAs for each of the tokens acceptable in that startcode.
scanner2nfa :: Encoding -> Scanner -> [StartCode] -> NFA
scanner2nfa enc Scanner{scannerTokens = toks} startcodes
= runNFA enc $
do
-- make a start state for each start code (these will be
-- numbered from zero).
start_states <- sequence (replicate (length startcodes) newState)
-- construct the NFA for each token
tok_states <- zipWithM do_token toks [0..]
-- make an epsilon edge from each start state to each
-- token that is acceptable in that state
zipWithM_ (tok_transitions (zip toks tok_states))
startcodes start_states
where
do_token (RECtx _scs lctx re rctx code) prio = do
b <- newState
e <- newState
rexp2nfa b e re
rctx_e <- case rctx of
NoRightContext ->
return NoRightContext
RightContextCode code' ->
return (RightContextCode code')
RightContextRExp re' -> do
r_b <- newState
r_e <- newState
rexp2nfa r_b r_e re'
accept r_e rctxt_accept
return (RightContextRExp r_b)
let lctx' = case lctx of
Nothing -> Nothing
Just st -> Just st
accept e (Acc prio code lctx' rctx_e)
return b
tok_transitions toks_with_states start_code start_state = do
let states = [ s | (RECtx scs _ _ _ _, s) <- toks_with_states,
null scs || start_code `elem` map snd scs ]
mapM_ (epsilonEdge start_state) states
-- -----------------------------------------------------------------------------
-- NFA creation from a regular expression
-- rexp2nfa B E R generates an NFA that begins in state B, recognises
-- R, and ends in state E only if R has been recognised.
rexp2nfa :: SNum -> SNum -> RExp -> NFAM ()
rexp2nfa b e Eps = epsilonEdge b e
rexp2nfa b e (Ch p) = charEdge b p e
rexp2nfa b e (re1 :%% re2) = do
s <- newState
rexp2nfa b s re1
rexp2nfa s e re2
rexp2nfa b e (re1 :| re2) = do
rexp2nfa b e re1
rexp2nfa b e re2
rexp2nfa b e (Star re) = do
s <- newState
epsilonEdge b s
rexp2nfa s s re
epsilonEdge s e
rexp2nfa b e (Plus re) = do
s1 <- newState
s2 <- newState
rexp2nfa s1 s2 re
epsilonEdge b s1
epsilonEdge s2 s1
epsilonEdge s2 e
rexp2nfa b e (Ques re) = do
rexp2nfa b e re
epsilonEdge b e
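-- The Thompson-style construction above can be sketched in isolation.  The
-- fragment below is illustrative only: it uses char edges instead of byte
-- sets and a bare state counter instead of the NFAM monad, and covers just
-- the Eps, Ch, sequence, alternation, and Star cases.

```haskell
data RE = Eps' | Ch' Char | RE :%%% RE | RE :||| RE | Star' RE

type Edge = (Int, Maybe Char, Int)   -- (from, Nothing = epsilon, to)

-- build b e re n = (edges, n'): an NFA from state b to state e,
-- allocating fresh states from the counter n
build :: Int -> Int -> RE -> Int -> ([Edge], Int)
build b e Eps'    n = ([(b, Nothing, e)], n)
build b e (Ch' c) n = ([(b, Just c, e)], n)
build b e (r1 :%%% r2) n =
  let s = n                                 -- fresh state joining r1 and r2
      (es1, n1) = build b s r1 (n + 1)
      (es2, n2) = build s e r2 n1
  in (es1 ++ es2, n2)
build b e (r1 :||| r2) n =
  let (es1, n1) = build b e r1 n            -- both branches share b and e
      (es2, n2) = build b e r2 n1
  in (es1 ++ es2, n2)
build b e (Star' r) n =
  let s = n                                 -- loop state, as in the Star case above
      (es, n1) = build s s r (n + 1)
  in ((b, Nothing, s) : (s, Nothing, e) : es, n1)
```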
-- -----------------------------------------------------------------------------
-- NFA creation monad.
-- Partial credit to Thomas Hallgren for this code, as I adapted it from
-- his "Lexing Haskell in Haskell" lexer generator.
type MapNFA = Map SNum NState
newtype NFAM a = N {unN :: SNum -> MapNFA -> Encoding -> (SNum, MapNFA, a)}
instance Functor NFAM where
fmap = liftM
instance Applicative NFAM where
pure a = N $ \s n _ -> (s,n,a)
(<*>) = ap
instance Monad NFAM where
return = pure
m >>= k = N $ \s n e -> case unN m s n e of
(s', n', a) -> unN (k a) s' n' e
runNFA :: Encoding -> NFAM () -> NFA
runNFA e m = case unN m 0 Map.empty e of
(s, nfa_map, ()) -> -- trace ("runNfa.." ++ show (Map.toAscList nfa_map)) $
e_close (array (0,s-1) (Map.toAscList nfa_map))
e_close :: Array Int NState -> NFA
e_close ar = listArray bds
[ NSt accs (out gr v) outs | (v, NSt accs _ outs) <- assocs ar ]
where
gr = t_close (hi+1,\v->nst_cl (ar!v))
bds@(_,hi) = bounds ar
newState :: NFAM SNum
newState = N $ \s n _ -> (s+1,n,s)
getEncoding :: NFAM Encoding
getEncoding = N $ \s n e -> (s,n,e)
anyBytes :: SNum -> Int -> SNum -> NFAM ()
anyBytes from 0 to = epsilonEdge from to
anyBytes from n to = do
s <- newState
byteEdge from (byteSetRange 0 0xff) s
anyBytes s (n-1) to
bytesEdge :: SNum -> [Byte] -> [Byte] -> SNum -> NFAM ()
bytesEdge from [] [] to = epsilonEdge from to
bytesEdge from [x] [y] to = byteEdge from (byteSetRange x y) to -- (OPTIMISATION)
bytesEdge from (x:xs) (y:ys) to
| x == y = do
s <- newState
byteEdge from (byteSetSingleton x) s
bytesEdge s xs ys to
| x < y = do
do s <- newState
byteEdge from (byteSetSingleton x) s
bytesEdge s xs (fmap (const 0xff) ys) to
do t <- newState
byteEdge from (byteSetSingleton y) t
bytesEdge t (fmap (const 0x00) xs) ys to
when ((x+1) <= (y-1)) $ do
u <- newState
byteEdge from (byteSetRange (x+1) (y-1)) u
anyBytes u (length xs) to
bytesEdge _ _ _ _ = undefined -- hide compiler warning
charEdge :: SNum -> CharSet -> SNum -> NFAM ()
charEdge from charset to = do
-- trace ("charEdge: " ++ (show $ charset) ++ " => " ++ show (byteRanges charset)) $
e <- getEncoding
forM_ (byteRanges e charset) $ \(xs,ys) -> do
bytesEdge from xs ys to
byteEdge :: SNum -> ByteSet -> SNum -> NFAM ()
byteEdge from charset to = N $ \s n _ -> (s, addEdge n, ())
where
addEdge n =
case Map.lookup from n of
Nothing ->
Map.insert from (NSt [] [] [(charset,to)]) n
Just (NSt acc eps trans) ->
Map.insert from (NSt acc eps ((charset,to):trans)) n
epsilonEdge :: SNum -> SNum -> NFAM ()
epsilonEdge from to
| from == to = return ()
| otherwise = N $ \s n _ -> let n' = addEdge n in n' `seq` (s, n', ())
where
addEdge n =
case Map.lookup from n of
Nothing -> Map.insert from (NSt [] [to] []) n
Just (NSt acc eps trans) -> Map.insert from (NSt acc (to:eps) trans) n
accept :: SNum -> Accept Code -> NFAM ()
accept state new_acc = N $ \s n _ -> (s, addAccept n, ())
where
addAccept n =
case Map.lookup state n of
Nothing ->
Map.insert state (NSt [new_acc] [] []) n
Just (NSt acc eps trans) ->
Map.insert state (NSt (new_acc:acc) eps trans) n
rctxt_accept :: Accept Code
rctxt_accept = Acc 0 Nothing Nothing NoRightContext
alex-3.2.5/src/Output.hs

-- -----------------------------------------------------------------------------
--
-- Output.hs, part of Alex
--
-- (c) Simon Marlow 2003
--
-- Code-outputting and table-generation routines
--
-- ----------------------------------------------------------------------------}
module Output (outputDFA) where
import AbsSyn
import CharSet
import Util
import qualified Map
import qualified Data.IntMap as IntMap
import Control.Monad.ST ( ST, runST )
import Data.Array ( Array )
import Data.Array.Base ( unsafeRead )
import Data.Array.ST ( STUArray, newArray, readArray, writeArray, freeze )
import Data.Array.Unboxed ( UArray, elems, (!), array, listArray )
import Data.Maybe (isJust)
import Data.Bits
import Data.Char ( ord, chr )
import Data.List ( maximumBy, sortBy, groupBy, mapAccumR )
-- -----------------------------------------------------------------------------
-- Printing the output
outputDFA :: Target -> Int -> String -> Scheme -> DFA SNum Code -> ShowS
outputDFA target _ _ scheme dfa
= interleave_shows nl
[outputBase, outputTable, outputCheck, outputDefault,
outputAccept, outputActions, outputSigs]
where
(base, table, check, deflt, accept) = mkTables dfa
intty = case target of
GhcTarget -> "Int#"
HaskellTarget -> "Int"
table_size = length table - 1
n_states = length base - 1
base_nm = "alex_base"
table_nm = "alex_table"
check_nm = "alex_check"
deflt_nm = "alex_deflt"
accept_nm = "alex_accept"
actions_nm = "alex_actions"
outputBase = do_array hexChars32 base_nm n_states base
outputTable = do_array hexChars16 table_nm table_size table
outputCheck = do_array hexChars16 check_nm table_size check
outputDefault = do_array hexChars16 deflt_nm n_states deflt
formatArray :: String -> Int -> [ShowS] -> ShowS
formatArray constructFunction size contents =
str constructFunction
. str " (0 :: Int, " . shows size . str ")\n"
. str " [ "
. interleave_shows (str "\n , ") contents
. str "\n ]"
do_array hex_chars nm upper_bound ints = -- trace ("do_array: " ++ nm) $
case target of
GhcTarget ->
str nm . str " :: AlexAddr\n"
. str nm . str " = AlexA#\n"
. str " \"" . str (hex_chars ints) . str "\"#\n"
_ ->
str nm . str " :: Array Int Int\n"
. str nm . str " = "
. formatArray "listArray" upper_bound (map shows ints)
. nl
outputAccept :: ShowS
outputAccept =
-- Don't emit explicit type signature as it contains unknown user type,
-- see: https://github.com/simonmar/alex/issues/98
-- str accept_nm . str " :: Array Int (AlexAcc " . str userStateTy . str ")\n"
str accept_nm . str " = "
. formatArray "listArray" n_states (snd (mapAccumR outputAccs 0 accept))
. nl
gscanActionType res =
str "AlexPosn -> Char -> String -> Int -> ((Int, state) -> "
. str res . str ") -> (Int, state) -> " . str res
outputActions = signature . body
where
(nacts, acts) = mapAccumR outputActs 0 accept
actionsArray :: ShowS
actionsArray = formatArray "array" nacts (concat acts)
body :: ShowS
body = str actions_nm . str " = " . actionsArray . nl
signature :: ShowS
signature = case scheme of
Default { defaultTypeInfo = Just (Nothing, actionty) } ->
str actions_nm . str " :: Array Int (" . str actionty . str ")\n"
Default { defaultTypeInfo = Just (Just tyclasses, actionty) } ->
str actions_nm . str " :: (" . str tyclasses
. str ") => Array Int (" . str actionty . str ")\n"
GScan { gscanTypeInfo = Just (Nothing, toktype) } ->
str actions_nm . str " :: Array Int ("
. gscanActionType toktype . str ")\n"
GScan { gscanTypeInfo = Just (Just tyclasses, toktype) } ->
str actions_nm . str " :: (" . str tyclasses
. str ") => Array Int ("
. gscanActionType toktype . str ")\n"
Basic { basicStrType = strty,
basicTypeInfo = Just (Nothing, toktype) } ->
str actions_nm . str " :: Array Int ("
. str (show strty) . str " -> " . str toktype
. str ")\n"
Basic { basicStrType = strty,
basicTypeInfo = Just (Just tyclasses, toktype) } ->
str actions_nm . str " :: (" . str tyclasses
. str ") => Array Int ("
. str (show strty) . str " -> " . str toktype
. str ")\n"
Posn { posnByteString = isByteString,
posnTypeInfo = Just (Nothing, toktype) } ->
str actions_nm . str " :: Array Int (AlexPosn -> "
. str (strtype isByteString) . str " -> " . str toktype
. str ")\n"
Posn { posnByteString = isByteString,
posnTypeInfo = Just (Just tyclasses, toktype) } ->
str actions_nm . str " :: (" . str tyclasses
. str ") => Array Int (AlexPosn -> "
. str (strtype isByteString) . str " -> " . str toktype
. str ")\n"
Monad { monadByteString = isByteString,
monadTypeInfo = Just (Nothing, toktype) } ->
let
actintty = if isByteString then "Int64" else "Int"
in
str actions_nm . str " :: Array Int (AlexInput -> "
. str actintty . str " -> Alex(" . str toktype . str "))\n"
Monad { monadByteString = isByteString,
monadTypeInfo = Just (Just tyclasses, toktype) } ->
let
actintty = if isByteString then "Int64" else "Int"
in
str actions_nm . str " :: (" . str tyclasses
. str ") => Array Int (AlexInput -> "
. str actintty . str " -> Alex(" . str toktype . str "))\n"
_ ->
-- No type signature: we don't know what the type of the actions is.
-- str accept_nm . str " :: Array Int (Accept Code)\n"
id
outputSigs
= case scheme of
Default { defaultTypeInfo = Just (Nothing, toktype) } ->
str "alex_scan_tkn :: () -> AlexInput -> " . str intty
. str " -> " . str "AlexInput -> " . str intty
. str " -> AlexLastAcc -> (AlexLastAcc, AlexInput)\n"
. str "alexScanUser :: () -> AlexInput -> Int -> AlexReturn ("
. str toktype . str ")\n"
. str "alexScan :: AlexInput -> Int -> AlexReturn ("
. str toktype . str ")\n"
Default { defaultTypeInfo = Just (Just tyclasses, toktype) } ->
str "alex_scan_tkn :: () -> AlexInput -> " . str intty
. str " -> " . str "AlexInput -> " . str intty
. str " -> AlexLastAcc -> (AlexLastAcc, AlexInput)\n"
. str "alexScanUser :: (" . str tyclasses
. str ") => () -> AlexInput -> Int -> AlexReturn ("
. str toktype . str ")\n"
. str "alexScan :: (" . str tyclasses
. str ") => AlexInput -> Int -> AlexReturn ("
. str toktype . str ")\n"
GScan { gscanTypeInfo = Just (Nothing, toktype) } ->
str "alex_scan_tkn :: () -> AlexInput -> " . str intty
. str " -> " . str "AlexInput -> " . str intty
. str " -> AlexLastAcc -> (AlexLastAcc, AlexInput)\n"
. str "alexScanUser :: () -> AlexInput -> Int -> "
. str "AlexReturn (" . gscanActionType toktype . str ")\n"
. str "alexScan :: AlexInput -> Int -> AlexReturn ("
. gscanActionType toktype . str ")\n"
GScan { gscanTypeInfo = Just (Just tyclasses, toktype) } ->
str "alex_scan_tkn :: () -> AlexInput -> " . str intty
. str " -> " . str "AlexInput -> " . str intty
. str " -> AlexLastAcc -> (AlexLastAcc, AlexInput)\n"
. str "alexScanUser :: (" . str tyclasses
. str ") => () -> AlexInput -> Int -> AlexReturn ("
. gscanActionType toktype . str ")\n"
. str "alexScan :: (" . str tyclasses
. str ") => AlexInput -> Int -> AlexReturn ("
. gscanActionType toktype . str ")\n"
Basic { basicStrType = strty,
basicTypeInfo = Just (Nothing, toktype) } ->
str "alex_scan_tkn :: () -> AlexInput -> " . str intty
. str " -> " . str "AlexInput -> " . str intty
. str " -> AlexLastAcc -> (AlexLastAcc, AlexInput)\n"
. str "alexScanUser :: () -> AlexInput -> Int -> AlexReturn ("
. str (show strty) . str " -> " . str toktype . str ")\n"
. str "alexScan :: AlexInput -> Int -> AlexReturn ("
. str (show strty) . str " -> " . str toktype . str ")\n"
Basic { basicStrType = strty,
basicTypeInfo = Just (Just tyclasses, toktype) } ->
str "alex_scan_tkn :: () -> AlexInput -> " . str intty
. str " -> " . str "AlexInput -> " . str intty
. str " -> AlexLastAcc -> (AlexLastAcc, AlexInput)\n"
. str "alexScanUser :: (" . str tyclasses
. str ") => () -> AlexInput -> Int -> AlexReturn ("
. str (show strty) . str " -> " . str toktype . str ")\n"
. str "alexScan :: (" . str tyclasses
. str ") => AlexInput -> Int -> AlexReturn ("
. str (show strty) . str " -> " . str toktype . str ")\n"
Posn { posnByteString = isByteString,
posnTypeInfo = Just (Nothing, toktype) } ->
str "alex_scan_tkn :: () -> AlexInput -> " . str intty
. str " -> " . str "AlexInput -> " . str intty
. str " -> AlexLastAcc -> (AlexLastAcc, AlexInput)\n"
. str "alexScanUser :: () -> AlexInput -> Int -> AlexReturn (AlexPosn -> "
. str (strtype isByteString) . str " -> " . str toktype . str ")\n"
. str "alexScan :: AlexInput -> Int -> AlexReturn (AlexPosn -> "
. str (strtype isByteString) . str " -> " . str toktype . str ")\n"
Posn { posnByteString = isByteString,
posnTypeInfo = Just (Just tyclasses, toktype) } ->
str "alex_scan_tkn :: () -> AlexInput -> " . str intty
. str " -> " . str "AlexInput -> " . str intty
. str " -> AlexLastAcc -> (AlexLastAcc, AlexInput)\n"
. str "alexScanUser :: (" . str tyclasses
. str ") => () -> AlexInput -> Int -> AlexReturn (AlexPosn -> "
. str (strtype isByteString) . str " -> " . str toktype . str ")\n"
. str "alexScan :: (" . str tyclasses
. str ") => AlexInput -> Int -> AlexReturn (AlexPosn -> "
. str (strtype isByteString) . str " -> " . str toktype . str ")\n"
Monad { monadTypeInfo = Just (Nothing, toktype),
monadByteString = isByteString,
monadUserState = userState } ->
let
actintty = if isByteString then "Int64" else "Int"
userStateTy | userState = "AlexUserState"
| otherwise = "()"
in
str "alex_scan_tkn :: " . str userStateTy
. str " -> AlexInput -> " . str intty
. str " -> " . str "AlexInput -> " . str intty
. str " -> AlexLastAcc -> (AlexLastAcc, AlexInput)\n"
. str "alexScanUser :: " . str userStateTy
. str " -> AlexInput -> Int -> AlexReturn ("
. str "AlexInput -> " . str actintty . str " -> Alex ("
. str toktype . str "))\n"
. str "alexScan :: AlexInput -> Int -> AlexReturn ("
. str "AlexInput -> " . str actintty
. str " -> Alex (" . str toktype . str "))\n"
. str "alexMonadScan :: Alex (" . str toktype . str ")\n"
Monad { monadTypeInfo = Just (Just tyclasses, toktype),
monadByteString = isByteString,
monadUserState = userState } ->
let
actintty = if isByteString then "Int64" else "Int"
userStateTy | userState = "AlexUserState"
| otherwise = "()"
in
str "alex_scan_tkn :: " . str userStateTy
. str " -> AlexInput -> " . str intty
. str " -> " . str "AlexInput -> " . str intty
. str " -> AlexLastAcc -> (AlexLastAcc, AlexInput)\n"
. str "alexScanUser :: (" . str tyclasses . str ") => "
. str userStateTy . str " -> AlexInput -> Int -> AlexReturn ("
. str "AlexInput -> " . str actintty
. str " -> Alex (" . str toktype . str "))\n"
. str "alexScan :: (" . str tyclasses
. str ") => AlexInput -> Int -> AlexReturn ("
. str "AlexInput -> " . str actintty
. str " -> Alex (" . str toktype . str "))\n"
. str "alexMonadScan :: (" . str tyclasses
. str ") => Alex (" . str toktype . str ")\n"
_ ->
str ""
outputAccs :: Int -> [Accept Code] -> (Int, ShowS)
outputAccs idx [] = (idx, str "AlexAccNone")
outputAccs idx (Acc _ Nothing Nothing NoRightContext : [])
= (idx, str "AlexAccSkip")
outputAccs idx (Acc _ (Just _) Nothing NoRightContext : [])
= (idx + 1, str "AlexAcc " . str (show idx))
outputAccs idx (Acc _ Nothing lctx rctx : rest)
= let (idx', rest') = outputAccs idx rest
in (idx', str "AlexAccSkipPred" . space
. paren (outputPred lctx rctx)
. paren rest')
outputAccs idx (Acc _ (Just _) lctx rctx : rest)
= let (idx', rest') = outputAccs idx rest
in (idx' + 1, str "AlexAccPred" . space
. str (show idx') . space
. paren (outputPred lctx rctx)
. paren rest')
outputActs :: Int -> [Accept Code] -> (Int, [ShowS])
outputActs idx =
let
outputAct _ (Acc _ Nothing _ _) = error "Shouldn't see this"
outputAct inneridx (Acc _ (Just act) _ _) =
(inneridx + 1, paren (shows inneridx . str "," . str act))
in
mapAccumR outputAct idx . filter (\(Acc _ act _ _) -> isJust act)
outputPred (Just set) NoRightContext
= outputLCtx set
outputPred Nothing rctx
= outputRCtx rctx
outputPred (Just set) rctx
= outputLCtx set
. str " `alexAndPred` "
. outputRCtx rctx
outputLCtx set = str "alexPrevCharMatches" . str (charSetQuote set)
outputRCtx NoRightContext = id
outputRCtx (RightContextRExp sn)
= str "alexRightContext " . shows sn
outputRCtx (RightContextCode code)
= str code
-- outputArr arr
-- = str "array " . shows (bounds arr) . space
-- . shows (assocs arr)
-- -----------------------------------------------------------------------------
-- Generating arrays.
-- Here we use the table-compression algorithm described in section
-- 3.9 of the dragon book, which is a common technique used by lexical
-- analyser generators.
-- We want to generate:
--
-- base :: Array SNum Int
-- maps the current state to an offset in the main table
--
-- table :: Array Int SNum
-- maps (base!state + char) to the next state
--
-- check :: Array Int SNum
-- maps (base!state + char) to state if table entry is valid,
-- otherwise we use the default for this state
--
-- default :: Array SNum SNum
-- default production for this state
--
-- accept :: Array SNum [Accept Code]
-- maps state to list of accept codes for this state
--
-- For each state, we decide what will be the default symbol (pick the
-- most common). We now have a mapping Char -> SNum, with one special
-- state reserved as the default.
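-- To make the roles of the four arrays concrete, here is a standalone
-- sketch (not the generated lexer code) of the lookup they support,
-- together with a hand-built two-state example over inputs {0,1}:
-- state 0 goes to state 1 on input 0 and defaults to -1 otherwise.
-- All names here are illustrative.

```haskell
import Data.Array (Array, listArray, (!))

-- next-state lookup over the compressed tables: the check array says
-- whether the table slot really belongs to this state
nextState :: Array Int Int -> Array Int Int -> Array Int Int
          -> Array Int Int -> Int -> Int -> Int
nextState base table check deflt s c
  | check ! i == c = table ! i   -- valid entry: follow the table
  | otherwise      = deflt ! s   -- otherwise take the state's default
  where i = base ! s + c

-- state 0 lives at offset 1, state 1 at offset 0 (no real entries)
exBase, exDeflt, exTable, exCheck :: Array Int Int
exBase  = listArray (0, 1) [1, 0]
exDeflt = listArray (0, 1) [-1, -1]
exTable = listArray (0, 2) [0, 1, 0]
exCheck = listArray (0, 2) [-1, 0, -1]
```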
mkTables :: DFA SNum Code
-> (
[Int], -- base
[Int], -- table
[Int], -- check
[Int], -- default
[[Accept Code]] -- accept
)
mkTables dfa = -- trace (show (defaults)) $
-- trace (show (fmap (length . snd) dfa_no_defaults)) $
( elems base_offs,
take max_off (elems table),
take max_off (elems check),
elems defaults,
accept
)
where
accept = [ as | State as _ <- elems dfa_arr ]
state_assocs = Map.toAscList (dfa_states dfa)
n_states = length state_assocs
top_state = n_states - 1
dfa_arr :: Array SNum (State SNum Code)
dfa_arr = array (0,top_state) state_assocs
-- fill in all the error productions
expand_states =
[ expand (dfa_arr!state) | state <- [0..top_state] ]
expand (State _ out) =
[(i, lookup' out i) | i <- [0..0xff]]
where lookup' out' i = case IntMap.lookup i out' of
Nothing -> -1
Just s -> s
defaults :: UArray SNum SNum
defaults = listArray (0,top_state) (map best_default expand_states)
-- find the most common destination state in a given state, and
-- make it the default.
best_default :: [(Int,SNum)] -> SNum
best_default prod_list
| null sorted = -1
| otherwise = snd (head (maximumBy lengths eq))
where sorted = sortBy compareSnds prod_list
compareSnds (_,a) (_,b) = compare a b
eq = groupBy (\(_,a) (_,b) -> a == b) sorted
lengths a b = length a `compare` length b
-- remove all the default productions from the DFA
dfa_no_defaults =
[ (s, prods_without_defaults s out)
| (s, out) <- zip [0..] expand_states
]
prods_without_defaults s out
= [ (fromIntegral c, dest) | (c,dest) <- out, dest /= defaults!s ]
(base_offs, table, check, max_off)
= runST (genTables n_states 255 dfa_no_defaults)
genTables
:: Int -- number of states
-> Int -- maximum token no.
-> [(SNum,[(Int,SNum)])] -- entries for the table
-> ST s (UArray Int Int, -- base
UArray Int Int, -- table
UArray Int Int, -- check
Int -- highest offset in table
)
genTables n_states max_token entries = do
base <- newArray (0, n_states-1) 0
table <- newArray (0, mAX_TABLE_SIZE) 0
check <- newArray (0, mAX_TABLE_SIZE) (-1)
off_arr <- newArray (-max_token, mAX_TABLE_SIZE) 0
max_off <- genTables' base table check off_arr entries max_token
base' <- freeze base
table' <- freeze table
check' <- freeze check
return (base', table',check',max_off+1)
where mAX_TABLE_SIZE = n_states * (max_token + 1)
genTables'
:: STUArray s Int Int -- base
-> STUArray s Int Int -- table
-> STUArray s Int Int -- check
-> STUArray s Int Int -- offset array
-> [(SNum,[(Int,SNum)])] -- entries for the table
-> Int -- maximum token no.
-> ST s Int -- highest offset in table
genTables' base table check off_arr entries max_token
= fit_all entries 0 1
where
fit_all [] max_off _ = return max_off
fit_all (s:ss) max_off fst_zero = do
(off, new_max_off, new_fst_zero) <- fit s max_off fst_zero
writeArray off_arr off 1
fit_all ss new_max_off new_fst_zero
-- fit a vector into the table. Return the offset of the vector,
-- the maximum offset used in the table, and the offset of the first
-- entry in the table (used to speed up the lookups a bit).
fit (_,[]) max_off fst_zero = return (0,max_off,fst_zero)
fit (state_no, state@((t,_):_)) max_off fst_zero = do
-- start at offset 1 in the table: all the empty states
-- (states with just a default reduction) are mapped to
-- offset zero.
off <- findFreeOffset (-t + fst_zero) check off_arr state
let new_max_off | furthest_right > max_off = furthest_right
| otherwise = max_off
furthest_right = off + max_token
--trace ("fit: state " ++ show state_no ++ ", off " ++ show off ++ ", elems " ++ show state) $ do
writeArray base state_no off
addState off table check state
new_fst_zero <- findFstFreeSlot check fst_zero
return (off, new_max_off, new_fst_zero)
-- Find a valid offset in the table for this state.
findFreeOffset :: Int
-> STUArray s Int Int
-> STUArray s Int Int
-> [(Int, Int)]
-> ST s Int
findFreeOffset off check off_arr state = do
-- offset 0 isn't allowed
if off == 0 then try_next else do
-- don't use an offset we've used before
b <- readArray off_arr off
if b /= 0 then try_next else do
-- check whether the actions for this state fit in the table
ok <- fits off state check
if ok then return off else try_next
where
try_next = findFreeOffset (off+1) check off_arr state
-- This is an inner loop, so we use some strictness hacks, and avoid
-- array bounds checks (unsafeRead instead of readArray) to speed
-- things up a bit.
fits :: Int -> [(Int,Int)] -> STUArray s Int Int -> ST s Bool
fits off [] check = off `seq` check `seq` return True -- strictness hacks
fits off ((t,_):rest) check = do
i <- unsafeRead check (off+t)
if i /= -1 then return False
else fits off rest check
addState :: Int -> STUArray s Int Int -> STUArray s Int Int -> [(Int, Int)]
-> ST s ()
addState _ _ _ [] = return ()
addState off table check ((t,val):state) = do
writeArray table (off+t) val
writeArray check (off+t) t
addState off table check state
findFstFreeSlot :: STUArray s Int Int -> Int -> ST s Int
findFstFreeSlot table n = do
i <- readArray table n
if i == -1 then return n
else findFstFreeSlot table (n+1)
-----------------------------------------------------------------------------
-- Convert an integer to a 16-bit number encoded in \xNN\xNN format suitable
-- for placing in a string (copied from Happy's ProduceCode.lhs)
hexChars16 :: [Int] -> String
hexChars16 acts = concat (map conv16 acts)
where
conv16 i | i > 0x7fff || i < -0x8000
= error ("Internal error: hexChars16: out of range: " ++ show i)
| otherwise
= hexChar16 i
hexChars32 :: [Int] -> String
hexChars32 acts = concat (map conv32 acts)
where
conv32 i = hexChar16 (i .&. 0xffff) ++
hexChar16 ((i `shiftR` 16) .&. 0xffff)
hexChar16 :: Int -> String
hexChar16 i = toHex (i .&. 0xff)
++ toHex ((i `shiftR` 8) .&. 0xff) -- force little-endian
toHex :: Int -> String
toHex i = ['\\','x', hexDig (i `div` 16), hexDig (i `mod` 16)]
hexDig :: Int -> Char
hexDig i | i <= 9 = chr (i + ord '0')
| otherwise = chr (i - 10 + ord 'a')
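-- As a worked example of the little-endian encoding above, here is a
-- standalone copy of the helpers (definitions repeated, primed names, so
-- the fragment stands alone): 0x1234 is emitted low byte first, as the
-- escape text "\x34\x12".

```haskell
import Data.Bits ((.&.), shiftR)
import Data.Char (chr, ord)

hexDig' :: Int -> Char
hexDig' i | i <= 9    = chr (i + ord '0')
          | otherwise = chr (i - 10 + ord 'a')

-- one byte as a four-character escape, e.g. 0x34 -> "\x34"
toHex' :: Int -> String
toHex' i = ['\\', 'x', hexDig' (i `div` 16), hexDig' (i `mod` 16)]

-- low byte first, then high byte: 0x1234 becomes "\x34\x12"
hexChar16' :: Int -> String
hexChar16' i = toHex' (i .&. 0xff) ++ toHex' ((i `shiftR` 8) .&. 0xff)
```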
alex-3.2.5/src/ParseMonad.hs

-- -----------------------------------------------------------------------------
--
-- ParseMonad.hs, part of Alex
--
-- (c) Simon Marlow 2003
--
-- ----------------------------------------------------------------------------}
module ParseMonad (
AlexInput, alexInputPrevChar, alexGetChar, alexGetByte,
AlexPosn(..), alexStartPos,
P, runP, StartCode, failP, lookupSMac, lookupRMac, newSMac, newRMac,
setStartCode, getStartCode, getInput, setInput,
) where
import AbsSyn hiding ( StartCode )
import CharSet ( CharSet )
import Map ( Map )
import qualified Map hiding ( Map )
import UTF8
#if __GLASGOW_HASKELL__ < 710
import Control.Applicative ( Applicative(..) )
#endif
import Control.Monad ( liftM, ap )
import Data.Word (Word8)
-- -----------------------------------------------------------------------------
-- The input type
--import Codec.Binary.UTF8.Light as UTF8
type Byte = Word8
type AlexInput = (AlexPosn, -- current position
Char, -- previous char
[Byte], -- remaining bytes of the current char
String) -- current input string
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar (_,c,_,_) = c
alexGetChar :: AlexInput -> Maybe (Char,AlexInput)
alexGetChar (_,_,[],[]) = Nothing
alexGetChar (p,_,[],(c:s)) = let p' = alexMove p c in p' `seq`
Just (c, (p', c, [], s))
alexGetChar (_, _ ,_ : _, _) = undefined -- hide compiler warning
alexGetByte :: AlexInput -> Maybe (Byte,AlexInput)
alexGetByte (p,c,(b:bs),s) = Just (b,(p,c,bs,s))
alexGetByte (_,_,[],[]) = Nothing
alexGetByte (p,_,[],(c:s)) = let p' = alexMove p c
(b:bs) = UTF8.encode c
in p' `seq` Just (b, (p', c, bs, s))
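-- The byte-buffering pattern of alexGetByte can be sketched standalone:
-- drain any buffered bytes first, otherwise encode the next character and
-- buffer its tail.  The encoder below is a hand-written UTF-8 encoder
-- standing in for UTF8.encode; all names here are illustrative, not
-- Alex's own module.

```haskell
import Data.Bits ((.&.), (.|.), shiftR)
import Data.Char (ord)
import Data.Word (Word8)

type Input = ([Word8], String)   -- buffered bytes, remaining chars

-- encode one Char as its UTF-8 byte sequence
utf8 :: Char -> [Word8]
utf8 ch
  | x < 0x80    = [fromIntegral x]
  | x < 0x800   = [0xC0 .|. hi 6,  0x80 .|. lo 0]
  | x < 0x10000 = [0xE0 .|. hi 12, 0x80 .|. lo 6,  0x80 .|. lo 0]
  | otherwise   = [0xF0 .|. hi 18, 0x80 .|. lo 12, 0x80 .|. lo 6, 0x80 .|. lo 0]
  where x    = ord ch
        hi n = fromIntegral (x `shiftR` n)
        lo n = fromIntegral ((x `shiftR` n) .&. 0x3F)

getByte :: Input -> Maybe (Word8, Input)
getByte (b : bs, s) = Just (b, (bs, s))      -- drain buffered bytes first
getByte ([], [])    = Nothing
getByte ([], c : s) = case utf8 c of
                        b : bs -> Just (b, (bs, s))
                        []     -> Nothing    -- utf8 never returns []
```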
-- -----------------------------------------------------------------------------
-- Token positions
-- `AlexPosn' records the location of a token in the input text. It has three
-- fields: the address (number of characters preceding the token), and the
-- line and column of the token within the file. `alexStartPos' gives the
-- position of the start of the file, and `alexMove' calculates the new
-- position after traversing a given character, assuming the usual
-- eight-character tab stops.
data AlexPosn = AlexPn !Int !Int !Int
deriving (Eq,Show)
alexStartPos :: AlexPosn
alexStartPos = AlexPn 0 1 1
alexMove :: AlexPosn -> Char -> AlexPosn
alexMove (AlexPn a l c) '\t' = AlexPn (a+1) l (((c+7) `div` 8)*8+1)
alexMove (AlexPn a l _) '\n' = AlexPn (a+1) (l+1) 1
alexMove (AlexPn a l c) _ = AlexPn (a+1) l (c+1)
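-- A worked example of the tab arithmetic above: a standalone copy using an
-- illustrative Pos type (columns are 1-based, tab stops every 8 columns, so
-- a tab at column 1 lands on column 9 and one at column 9 lands on 17).

```haskell
data Pos = Pos !Int !Int !Int   -- address, line, column
  deriving (Eq, Show)

move :: Pos -> Char -> Pos
move (Pos a l c) '\t' = Pos (a + 1) l (((c + 7) `div` 8) * 8 + 1)
move (Pos a l _) '\n' = Pos (a + 1) (l + 1) 1
move (Pos a l c) _    = Pos (a + 1) l (c + 1)
```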
-- -----------------------------------------------------------------------------
-- Alex lexing/parsing monad
type ParseError = (Maybe AlexPosn, String)
type StartCode = Int
data PState = PState {
smac_env :: Map String CharSet,
rmac_env :: Map String RExp,
startcode :: Int,
input :: AlexInput
}
newtype P a = P { unP :: PState -> Either ParseError (PState,a) }
instance Functor P where
fmap = liftM
instance Applicative P where
pure a = P $ \env -> Right (env,a)
(<*>) = ap
instance Monad P where
(P m) >>= k = P $ \env -> case m env of
Left err -> Left err
Right (env',ok) -> unP (k ok) env'
return = pure
runP :: String -> (Map String CharSet, Map String RExp)
-> P a -> Either ParseError a
runP str (senv,renv) (P p)
= case p initial_state of
Left err -> Left err
Right (_,a) -> Right a
where initial_state =
PState{ smac_env=senv, rmac_env=renv,
startcode = 0, input=(alexStartPos,'\n',[],str) }
failP :: String -> P a
failP str = P $ \PState{ input = (p,_,_,_) } -> Left (Just p,str)
-- Macros are expanded during parsing, to simplify the abstract
-- syntax. The parsing monad passes around two environments mapping
-- macro names to sets and regexps respectively.
lookupSMac :: (AlexPosn,String) -> P CharSet
lookupSMac (posn,smac)
= P $ \s@PState{ smac_env = senv } ->
case Map.lookup smac senv of
Just ok -> Right (s,ok)
Nothing -> Left (Just posn, "unknown set macro: $" ++ smac)
lookupRMac :: String -> P RExp
lookupRMac rmac
= P $ \s@PState{ rmac_env = renv } ->
case Map.lookup rmac renv of
Just ok -> Right (s,ok)
Nothing -> Left (Nothing, "unknown regex macro: %" ++ rmac)
newSMac :: String -> CharSet -> P ()
newSMac smac set
= P $ \s -> Right (s{smac_env = Map.insert smac set (smac_env s)}, ())
newRMac :: String -> RExp -> P ()
newRMac rmac rexp
= P $ \s -> Right (s{rmac_env = Map.insert rmac rexp (rmac_env s)}, ())
setStartCode :: StartCode -> P ()
setStartCode sc = P $ \s -> Right (s{ startcode = sc }, ())
getStartCode :: P StartCode
getStartCode = P $ \s -> Right (s, startcode s)
getInput :: P AlexInput
getInput = P $ \s -> Right (s, input s)
setInput :: AlexInput -> P ()
setInput inp = P $ \s -> Right (s{ input = inp }, ())
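-- The shape of P -- state threaded through Right, errors short-circuiting
-- via Left -- can be sketched in isolation.  The fragment below uses
-- illustrative names and is not part of Alex.

```haskell
newtype M s e a = M { runM :: s -> Either e (s, a) }

instance Functor (M s e) where
  fmap f (M m) = M $ \s -> fmap (fmap f) (m s)   -- map over the result slot

instance Applicative (M s e) where
  pure a = M $ \s -> Right (s, a)
  M mf <*> M ma = M $ \s -> case mf s of
    Left e        -> Left e                      -- errors short-circuit
    Right (s', f) -> fmap (fmap f) (ma s')

instance Monad (M s e) where
  M m >>= k = M $ \s -> case m s of
    Left e        -> Left e
    Right (s', a) -> runM (k a) s'               -- thread the updated state

getS :: M s e s
getS = M $ \s -> Right (s, s)

putS :: s -> M s e ()
putS s = M $ \_ -> Right (s, ())

abort :: e -> M s e a
abort e = M $ \_ -> Left e
```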
alex-3.2.5/src/Parser.hs

{-# OPTIONS_GHC -w #-}
{-# OPTIONS -XMagicHash -XBangPatterns -XTypeSynonymInstances -XFlexibleInstances -cpp #-}
#if __GLASGOW_HASKELL__ >= 710
{-# OPTIONS_GHC -XPartialTypeSignatures #-}
#endif
-- -----------------------------------------------------------------------------
--
-- Parser.y, part of Alex
--
-- (c) Simon Marlow 2003
--
-- -----------------------------------------------------------------------------
{-# OPTIONS_GHC -w #-}
module Parser ( parse, P ) where
import AbsSyn
import Scan
import CharSet
import ParseMonad hiding ( StartCode )
import Data.Char
--import Debug.Trace
import qualified Data.Array as Happy_Data_Array
import qualified Data.Bits as Bits
import qualified GHC.Exts as Happy_GHC_Exts
import Control.Applicative(Applicative(..))
import Control.Monad (ap)
-- parser produced by Happy Version 1.19.10
newtype HappyAbsSyn = HappyAbsSyn HappyAny
#if __GLASGOW_HASKELL__ >= 607
type HappyAny = Happy_GHC_Exts.Any
#else
type HappyAny = forall a . a
#endif
newtype HappyWrap4 = HappyWrap4 ((Maybe (AlexPosn,Code), [Directive], Scanner, Maybe (AlexPosn,Code)))
happyIn4 :: ((Maybe (AlexPosn,Code), [Directive], Scanner, Maybe (AlexPosn,Code))) -> (HappyAbsSyn )
happyIn4 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap4 x)
{-# INLINE happyIn4 #-}
happyOut4 :: (HappyAbsSyn ) -> HappyWrap4
happyOut4 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut4 #-}
newtype HappyWrap5 = HappyWrap5 (Maybe (AlexPosn,Code))
happyIn5 :: (Maybe (AlexPosn,Code)) -> (HappyAbsSyn )
happyIn5 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap5 x)
{-# INLINE happyIn5 #-}
happyOut5 :: (HappyAbsSyn ) -> HappyWrap5
happyOut5 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut5 #-}
newtype HappyWrap6 = HappyWrap6 ([Directive])
happyIn6 :: ([Directive]) -> (HappyAbsSyn )
happyIn6 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap6 x)
{-# INLINE happyIn6 #-}
happyOut6 :: (HappyAbsSyn ) -> HappyWrap6
happyOut6 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut6 #-}
newtype HappyWrap7 = HappyWrap7 (Directive)
happyIn7 :: (Directive) -> (HappyAbsSyn )
happyIn7 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap7 x)
{-# INLINE happyIn7 #-}
happyOut7 :: (HappyAbsSyn ) -> HappyWrap7
happyOut7 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut7 #-}
newtype HappyWrap8 = HappyWrap8 (Encoding)
happyIn8 :: (Encoding) -> (HappyAbsSyn )
happyIn8 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap8 x)
{-# INLINE happyIn8 #-}
happyOut8 :: (HappyAbsSyn ) -> HappyWrap8
happyOut8 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut8 #-}
newtype HappyWrap9 = HappyWrap9 (())
happyIn9 :: (()) -> (HappyAbsSyn )
happyIn9 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap9 x)
{-# INLINE happyIn9 #-}
happyOut9 :: (HappyAbsSyn ) -> HappyWrap9
happyOut9 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut9 #-}
newtype HappyWrap10 = HappyWrap10 (())
happyIn10 :: (()) -> (HappyAbsSyn )
happyIn10 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap10 x)
{-# INLINE happyIn10 #-}
happyOut10 :: (HappyAbsSyn ) -> HappyWrap10
happyOut10 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut10 #-}
newtype HappyWrap11 = HappyWrap11 (Scanner)
happyIn11 :: (Scanner) -> (HappyAbsSyn )
happyIn11 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap11 x)
{-# INLINE happyIn11 #-}
happyOut11 :: (HappyAbsSyn ) -> HappyWrap11
happyOut11 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut11 #-}
newtype HappyWrap12 = HappyWrap12 ([RECtx])
happyIn12 :: ([RECtx]) -> (HappyAbsSyn )
happyIn12 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap12 x)
{-# INLINE happyIn12 #-}
happyOut12 :: (HappyAbsSyn ) -> HappyWrap12
happyOut12 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut12 #-}
newtype HappyWrap13 = HappyWrap13 ([RECtx])
happyIn13 :: ([RECtx]) -> (HappyAbsSyn )
happyIn13 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap13 x)
{-# INLINE happyIn13 #-}
happyOut13 :: (HappyAbsSyn ) -> HappyWrap13
happyOut13 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut13 #-}
newtype HappyWrap14 = HappyWrap14 (RECtx)
happyIn14 :: (RECtx) -> (HappyAbsSyn )
happyIn14 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap14 x)
{-# INLINE happyIn14 #-}
happyOut14 :: (HappyAbsSyn ) -> HappyWrap14
happyOut14 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut14 #-}
newtype HappyWrap15 = HappyWrap15 ([RECtx])
happyIn15 :: ([RECtx]) -> (HappyAbsSyn )
happyIn15 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap15 x)
{-# INLINE happyIn15 #-}
happyOut15 :: (HappyAbsSyn ) -> HappyWrap15
happyOut15 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut15 #-}
newtype HappyWrap16 = HappyWrap16 ([(String,StartCode)])
happyIn16 :: ([(String,StartCode)]) -> (HappyAbsSyn )
happyIn16 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap16 x)
{-# INLINE happyIn16 #-}
happyOut16 :: (HappyAbsSyn ) -> HappyWrap16
happyOut16 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut16 #-}
newtype HappyWrap17 = HappyWrap17 ([(String,StartCode)])
happyIn17 :: ([(String,StartCode)]) -> (HappyAbsSyn )
happyIn17 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap17 x)
{-# INLINE happyIn17 #-}
happyOut17 :: (HappyAbsSyn ) -> HappyWrap17
happyOut17 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut17 #-}
newtype HappyWrap18 = HappyWrap18 (String)
happyIn18 :: (String) -> (HappyAbsSyn )
happyIn18 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap18 x)
{-# INLINE happyIn18 #-}
happyOut18 :: (HappyAbsSyn ) -> HappyWrap18
happyOut18 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut18 #-}
newtype HappyWrap19 = HappyWrap19 (Maybe Code)
happyIn19 :: (Maybe Code) -> (HappyAbsSyn )
happyIn19 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap19 x)
{-# INLINE happyIn19 #-}
happyOut19 :: (HappyAbsSyn ) -> HappyWrap19
happyOut19 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut19 #-}
newtype HappyWrap20 = HappyWrap20 (Maybe CharSet, RExp, RightContext RExp)
happyIn20 :: (Maybe CharSet, RExp, RightContext RExp) -> (HappyAbsSyn )
happyIn20 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap20 x)
{-# INLINE happyIn20 #-}
happyOut20 :: (HappyAbsSyn ) -> HappyWrap20
happyOut20 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut20 #-}
newtype HappyWrap21 = HappyWrap21 (CharSet)
happyIn21 :: (CharSet) -> (HappyAbsSyn )
happyIn21 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap21 x)
{-# INLINE happyIn21 #-}
happyOut21 :: (HappyAbsSyn ) -> HappyWrap21
happyOut21 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut21 #-}
newtype HappyWrap22 = HappyWrap22 (RightContext RExp)
happyIn22 :: (RightContext RExp) -> (HappyAbsSyn )
happyIn22 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap22 x)
{-# INLINE happyIn22 #-}
happyOut22 :: (HappyAbsSyn ) -> HappyWrap22
happyOut22 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut22 #-}
newtype HappyWrap23 = HappyWrap23 (RExp)
happyIn23 :: (RExp) -> (HappyAbsSyn )
happyIn23 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap23 x)
{-# INLINE happyIn23 #-}
happyOut23 :: (HappyAbsSyn ) -> HappyWrap23
happyOut23 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut23 #-}
newtype HappyWrap24 = HappyWrap24 (RExp)
happyIn24 :: (RExp) -> (HappyAbsSyn )
happyIn24 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap24 x)
{-# INLINE happyIn24 #-}
happyOut24 :: (HappyAbsSyn ) -> HappyWrap24
happyOut24 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut24 #-}
newtype HappyWrap25 = HappyWrap25 (RExp)
happyIn25 :: (RExp) -> (HappyAbsSyn )
happyIn25 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap25 x)
{-# INLINE happyIn25 #-}
happyOut25 :: (HappyAbsSyn ) -> HappyWrap25
happyOut25 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut25 #-}
newtype HappyWrap26 = HappyWrap26 (RExp -> RExp)
happyIn26 :: (RExp -> RExp) -> (HappyAbsSyn )
happyIn26 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap26 x)
{-# INLINE happyIn26 #-}
happyOut26 :: (HappyAbsSyn ) -> HappyWrap26
happyOut26 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut26 #-}
newtype HappyWrap27 = HappyWrap27 (RExp)
happyIn27 :: (RExp) -> (HappyAbsSyn )
happyIn27 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap27 x)
{-# INLINE happyIn27 #-}
happyOut27 :: (HappyAbsSyn ) -> HappyWrap27
happyOut27 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut27 #-}
newtype HappyWrap28 = HappyWrap28 (CharSet)
happyIn28 :: (CharSet) -> (HappyAbsSyn )
happyIn28 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap28 x)
{-# INLINE happyIn28 #-}
happyOut28 :: (HappyAbsSyn ) -> HappyWrap28
happyOut28 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut28 #-}
newtype HappyWrap29 = HappyWrap29 (CharSet)
happyIn29 :: (CharSet) -> (HappyAbsSyn )
happyIn29 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap29 x)
{-# INLINE happyIn29 #-}
happyOut29 :: (HappyAbsSyn ) -> HappyWrap29
happyOut29 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut29 #-}
newtype HappyWrap30 = HappyWrap30 ([CharSet])
happyIn30 :: ([CharSet]) -> (HappyAbsSyn )
happyIn30 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap30 x)
{-# INLINE happyIn30 #-}
happyOut30 :: (HappyAbsSyn ) -> HappyWrap30
happyOut30 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut30 #-}
newtype HappyWrap31 = HappyWrap31 ((AlexPosn,String))
happyIn31 :: ((AlexPosn,String)) -> (HappyAbsSyn )
happyIn31 x = Happy_GHC_Exts.unsafeCoerce# (HappyWrap31 x)
{-# INLINE happyIn31 #-}
happyOut31 :: (HappyAbsSyn ) -> HappyWrap31
happyOut31 x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOut31 #-}
happyInTok :: (Token) -> (HappyAbsSyn )
happyInTok x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyInTok #-}
happyOutTok :: (HappyAbsSyn ) -> (Token)
happyOutTok x = Happy_GHC_Exts.unsafeCoerce# x
{-# INLINE happyOutTok #-}
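The `happyInN`/`happyOutN` pairs above coerce every semantic value to the single type `Any` so that one untyped stack can carry all nonterminals, and coerce back only at use sites where the grammar fixes the type. A stand-alone miniature of that trick (hypothetical names, using the portable `unsafeCoerce` rather than the primitive `unsafeCoerce#`):

```haskell
import GHC.Exts (Any)
import Unsafe.Coerce (unsafeCoerce)

-- Hypothetical miniature of a happyInN/happyOutN pair: wrap a value
-- into the uniform Any representation, and recover it unchanged.
-- This is only safe because wrap and unwrap agree on the type.
wrapInt :: Int -> Any
wrapInt = unsafeCoerce

unwrapInt :: Any -> Int
unwrapInt = unsafeCoerce
```

The generated parser relies on the invariant that each stack slot is unwrapped at exactly the type it was wrapped with; nothing checks this at runtime, which is why the real code confines the coercions to these inlined helper pairs.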
happyExpList :: HappyAddr
happyExpList = HappyA# "\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\x00\x00\x00\x10\x00\x00\x00\x00\x00\x00\x00\x00\xc0\x07\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x03\x00\x00\x00\x00\x00\x00\x00\xc0\x07\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00\x00\x00\x00\x00\x20\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00\x00\x00\x00\x00\x20\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00\x00\x30\x00\x00\x00\x00\x08\x00\x14\x60\x00\x00\x00\x00\x80\x00\x48\x21\x0e\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x80\x20\x48\x21\x0e\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xc0\x03\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x08\x80\x15\xe2\x00\x00\x00\x00\x80\x00\x40\x01\x06\x00\x00\x00\x00\x08\x00\x54\x60\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x08\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x20\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\x28\x80\x54\xe2\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x28\x80\x54\xe2\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x08\xa0\x54\xe2\x00\x00\x00\x00\x00\x01\x00\x00\x01\x00\x00\x00\x00\x08\x80\x14\xe2\x00\x00\x00\x00\x00\x10\x00\x08\x00\x00\x00\x00\x00\x00\x00\x42\x00\x00\x00\x00\x00\x00\x00\x00\x90\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x08\x00\x14\x60\x00\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00\x00\x08\x00\x16\x60\x00\x00\x00\x00\x00\x00\x00\x02\x00\x00\x00\x00\x00\x08\x00\x14\x60\x00\x00\
\\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x20\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x08\x80\x14\xe2\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x80\x40\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x20\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00\x80\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x08\x80\x14\xf2\x00\x00\x00\x00\x00\x10\x00\x08\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x08\x80\x54\xe2\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x08\x80\x54\xe2\x00\x00\x00\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x90\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x40\x00\x20\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x04\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00"#
{-# NOINLINE happyExpListPerState #-}
happyExpListPerState st =
    token_strs_expected
  where token_strs = ["error","%dummy","%start_parse","alex","maybe_code","directives","directive","encoding","macdefs","macdef","scanner","tokendefs","tokendef","rule","rules","startcodes","startcodes0","startcode","rhs","context","left_ctx","right_ctx","rexp","alt","term","rep","rexp0","set","set0","sets","smac","'.'","';'","'<'","'>'","','","'$'","'|'","'*'","'+'","'?'","'{'","'}'","'('","')'","'#'","'~'","'-'","'['","']'","'^'","'/'","ZERO","STRING","BIND","ID","CODE","CHAR","SMAC","RMAC","SMAC_DEF","RMAC_DEF","WRAPPER","ENCODING","ACTIONTYPE","TOKENTYPE","TYPECLASS","%eof"]
        bit_start = st * 68
        bit_end = (st + 1) * 68
        read_bit = readArrayBit happyExpList
        bits = map read_bit [bit_start..bit_end - 1]
        bits_indexed = zip bits [0..67]
        token_strs_expected = concatMap f bits_indexed
        f (False, _) = []
        f (True, nr) = [token_strs !! nr]
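`happyExpListPerState` recovers the expected-token names for an error state by reading one bit per token (68 of them here) out of the packed `happyExpList` bitmap and keeping the names whose bit is set. A small model of just the decoding step (hypothetical name, with the bits already unpacked to `Bool`s):

```haskell
-- Hypothetical model of the bitmap decoding in happyExpListPerState:
-- keep the name of every token whose "expected" bit is set.
expectedTokens :: [String] -> [Bool] -> [String]
expectedTokens names bits = [ name | (True, name) <- zip bits names ]
```

The real function computes the bit range for the state (`st * 68` to `(st + 1) * 68 - 1`) and reads each bit out of the packed address with `readArrayBit` before this filtering step.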
happyActOffsets :: HappyAddr
happyActOffsets = HappyA# "\xec\xff\xec\xff\xf3\x00\x00\x00\xe5\xff\xfc\xff\xf3\x00\x0d\x00\x18\x00\x2e\x00\x40\x00\x45\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x49\x00\xfc\xff\x7d\x00\x6f\x00\x00\x00\x27\x00\x00\x00\xb4\x00\x08\x00\x00\x00\x00\x00\x00\x00\x39\x00\x7d\x00\x0f\x00\x00\x00\x3c\x00\x00\x00\x00\x00\x5f\x00\x00\x00\x4f\x00\x01\x00\x00\x00\x01\x00\x00\x00\x15\x00\xff\xff\x6f\x00\xfd\xff\x24\x00\x26\x00\x00\x00\x00\x00\x7d\x00\x58\x00\x75\x00\x62\x00\x7d\x00\x00\x00\x64\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x5c\x00\x00\x00\x6f\x00\x00\x00\x21\x00\x00\x00\x68\x00\x00\x00\x00\x00\x00\x00\x00\x00\x79\x00\x7b\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x4b\x00\xfd\xff\x00\x00\x00\x00\x00\x00\x00\x00\x5d\x00\x00\x00\x5d\x00\x76\x00\x00\x00\x00\x00\x00\x00\x26\x00\x00\x00\x00\x00\xf9\xff\x00\x00\x00\x00\x77\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00"#
happyGotoOffsets :: HappyAddr
happyGotoOffsets = HappyA# "\x35\x00\x87\x00\x3e\x00\x00\x00\x00\x00\x4d\x00\x57\x00\x00\x00\x85\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x8e\x00\x5a\x00\x36\x00\xf4\xff\x00\x00\xf7\x00\x00\x00\x78\x00\x00\x00\x00\x00\x00\x00\x00\x00\xd7\x00\x22\x00\xb6\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x96\x00\x8a\x00\x00\x00\x9e\x00\x00\x00\xce\x00\x8d\x00\xe0\x00\x92\x00\x00\x00\x56\x00\x00\x00\x00\x00\x2f\x00\x00\x00\x00\x01\x00\x00\x04\x01\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xe9\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xf2\x00\x97\x00\x00\x00\x00\x00\x00\x00\x00\x00\xb0\x00\x00\x00\xc2\x00\x00\x00\x00\x00\x00\x00\x00\x00\x5e\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00"#
happyAdjustOffset :: Happy_GHC_Exts.Int# -> Happy_GHC_Exts.Int#
happyAdjustOffset off = off
happyDefActions :: HappyAddr
happyDefActions = HappyA# "\xfc\xff\x00\x00\xfa\xff\xfd\xff\x00\x00\xf2\xff\xfa\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xf5\xff\xf6\xff\xf7\xff\xf8\xff\xf4\xff\xf9\xff\xfb\xff\x00\x00\xf2\xff\x00\x00\x00\x00\xf0\xff\xd6\xff\xd4\xff\xd2\xff\xc8\xff\xc5\xff\xc2\xff\xbc\xff\x00\x00\x00\x00\xbd\xff\xca\xff\xc4\xff\xbb\xff\xc9\xff\xf1\xff\xf3\xff\xfc\xff\xed\xff\xef\xff\xed\xff\xea\xff\x00\x00\x00\x00\x00\x00\xd8\xff\xc8\xff\x00\x00\xdd\xff\xfe\xff\x00\x00\x00\x00\xbd\xff\x00\x00\xbd\xff\xbf\xff\x00\x00\xcb\xff\xd3\xff\xd1\xff\xd0\xff\xcf\xff\x00\x00\xd5\xff\x00\x00\xd7\xff\x00\x00\xc7\xff\x00\x00\xc1\xff\xbe\xff\xc3\xff\xc6\xff\x00\x00\xe4\xff\xe3\xff\xe2\xff\xdc\xff\xde\xff\xdb\xff\x00\x00\xd8\xff\xe9\xff\xe0\xff\xe1\xff\xec\xff\xe7\xff\xee\xff\xe7\xff\x00\x00\xdf\xff\xda\xff\xd9\xff\x00\x00\xe6\xff\xc0\xff\x00\x00\xce\xff\xcd\xff\x00\x00\xe5\xff\xeb\xff\xe8\xff\xcc\xff"#
happyCheck :: HappyAddr
happyCheck = HappyA# "\xff\xff\x02\x00\x01\x00\x06\x00\x03\x00\x0c\x00\x1a\x00\x13\x00\x14\x00\x15\x00\x25\x00\x17\x00\x18\x00\x19\x00\x0d\x00\x1b\x00\x01\x00\x10\x00\x15\x00\x12\x00\x1b\x00\x14\x00\x01\x00\x0f\x00\x17\x00\x1a\x00\x1e\x00\x1f\x00\x1b\x00\x1c\x00\x1d\x00\x10\x00\x0b\x00\x12\x00\x0d\x00\x14\x00\x17\x00\x10\x00\x05\x00\x12\x00\x01\x00\x14\x00\x1b\x00\x1c\x00\x17\x00\x0c\x00\x07\x00\x17\x00\x1b\x00\x1c\x00\x1d\x00\x0f\x00\x0d\x00\x00\x00\x01\x00\x10\x00\x14\x00\x12\x00\x01\x00\x19\x00\x16\x00\x1b\x00\x17\x00\x19\x00\x02\x00\x03\x00\x1b\x00\x1c\x00\x1d\x00\x17\x00\x0d\x00\x0e\x00\x19\x00\x10\x00\x1b\x00\x12\x00\x01\x00\x11\x00\x18\x00\x19\x00\x17\x00\x1b\x00\x05\x00\x06\x00\x1b\x00\x1c\x00\x1d\x00\x17\x00\x0d\x00\x02\x00\x03\x00\x10\x00\x17\x00\x12\x00\x01\x00\x05\x00\x06\x00\x18\x00\x17\x00\x0d\x00\x0e\x00\x1a\x00\x1b\x00\x1c\x00\x1d\x00\x1a\x00\x0d\x00\x0d\x00\x0e\x00\x10\x00\x0f\x00\x12\x00\x01\x00\x14\x00\x0e\x00\x1b\x00\x17\x00\x13\x00\x01\x00\x1b\x00\x1b\x00\x1c\x00\x1d\x00\x13\x00\x0d\x00\x04\x00\x01\x00\x10\x00\x05\x00\x12\x00\x0c\x00\x0c\x00\x0f\x00\x10\x00\x17\x00\x12\x00\x01\x00\x04\x00\x1b\x00\x1c\x00\x1d\x00\x10\x00\x16\x00\x12\x00\x1b\x00\x1c\x00\x08\x00\x09\x00\x0a\x00\x07\x00\x0c\x00\x01\x00\x1b\x00\x1c\x00\x10\x00\x11\x00\x0f\x00\x13\x00\x14\x00\x15\x00\xff\xff\x17\x00\x18\x00\x19\x00\x12\x00\x1b\x00\x08\x00\x09\x00\x0a\x00\x12\x00\x0c\x00\xff\xff\xff\xff\xff\xff\x10\x00\x11\x00\xff\xff\x13\x00\x14\x00\x15\x00\xff\xff\x17\x00\x18\x00\x19\x00\xff\xff\x1b\x00\x0a\x00\x0b\x00\x08\x00\x09\x00\x0a\x00\x0b\x00\x10\x00\x11\x00\xff\xff\x13\x00\x14\x00\x15\x00\xff\xff\x17\x00\x18\x00\x19\x00\xff\xff\x1b\x00\x0a\x00\x0b\x00\x18\x00\x19\x00\x1a\x00\x1b\x00\x10\x00\x11\x00\xff\xff\x13\x00\x14\x00\x15\x00\x0a\x00\x17\x00\x18\x00\x19\x00\xff\xff\x1b\x00\x10\x00\x11\x00\xff\xff\x13\x00\x14\x00\x15\x00\xff\xff\x17\x00\x18\x00\x19\x00\xff\xff\x1b\x00\x13\x00\x14\x00\x15\x00\xff\xff\x17\x00\x18\x00\x19\x00\xff\xff\x1b\x00\x13\x00\x14\x00\x15\x00\xff\xff\
\\x17\x00\x18\x00\x19\x00\xff\xff\x1b\x00\x13\x00\x14\x00\x15\x00\xff\xff\x17\x00\x18\x00\x19\x00\xff\xff\x1b\x00\x13\x00\x14\x00\x15\x00\xff\xff\x17\x00\x18\x00\x19\x00\x15\x00\x1b\x00\x17\x00\x18\x00\x19\x00\xff\xff\x1b\x00\x20\x00\x21\x00\x22\x00\x23\x00\x24\x00\x18\x00\x19\x00\x1a\x00\x1b\x00\x18\x00\x19\x00\x1a\x00\x1b\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff"#
happyTable :: HappyAddr
happyTable = HappyA# "\x00\x00\x57\x00\x1f\x00\x53\x00\x33\x00\x66\x00\x04\x00\x17\x00\x18\x00\x19\x00\xff\xff\x1a\x00\x1b\x00\x1c\x00\x20\x00\x1d\x00\x1f\x00\x21\x00\x54\x00\x22\x00\x67\x00\x34\x00\x1f\x00\x36\x00\x23\x00\x58\x00\x16\x00\x17\x00\x24\x00\x25\x00\x26\x00\x21\x00\x5a\x00\x22\x00\x20\x00\x3a\x00\x12\x00\x21\x00\x64\x00\x22\x00\x1f\x00\x34\x00\x24\x00\x25\x00\x23\x00\x65\x00\x44\x00\x11\x00\x24\x00\x25\x00\x26\x00\x36\x00\x20\x00\x04\x00\x02\x00\x21\x00\x51\x00\x22\x00\x1f\x00\x3a\x00\x4f\x00\x1d\x00\x23\x00\x50\x00\x05\x00\x06\x00\x24\x00\x25\x00\x26\x00\x0f\x00\x20\x00\x3d\x00\x4b\x00\x21\x00\x1d\x00\x22\x00\x1f\x00\x37\x00\x26\x00\x1c\x00\x23\x00\x1d\x00\x13\x00\x14\x00\x24\x00\x25\x00\x26\x00\x0e\x00\x20\x00\x12\x00\x06\x00\x21\x00\x0d\x00\x22\x00\x1f\x00\x27\x00\x14\x00\x2a\x00\x23\x00\x4c\x00\x4d\x00\x60\x00\x24\x00\x25\x00\x26\x00\x04\x00\x20\x00\x67\x00\x4d\x00\x21\x00\x36\x00\x22\x00\x1f\x00\x34\x00\x47\x00\x4b\x00\x23\x00\x49\x00\x1f\x00\x46\x00\x24\x00\x25\x00\x26\x00\x63\x00\x20\x00\x62\x00\x1f\x00\x21\x00\x61\x00\x22\x00\x69\x00\x6b\x00\x36\x00\x21\x00\x23\x00\x22\x00\x02\x00\x0f\x00\x24\x00\x25\x00\x26\x00\x21\x00\x3d\x00\x22\x00\x24\x00\x25\x00\x2a\x00\x2b\x00\x2c\x00\x28\x00\x2d\x00\x34\x00\x24\x00\x25\x00\x2e\x00\x2f\x00\x55\x00\x30\x00\x18\x00\x19\x00\x00\x00\x1a\x00\x31\x00\x1c\x00\x51\x00\x1d\x00\x5a\x00\x2b\x00\x2c\x00\x5d\x00\x2d\x00\x00\x00\x00\x00\x00\x00\x2e\x00\x2f\x00\x00\x00\x30\x00\x18\x00\x19\x00\x00\x00\x1a\x00\x31\x00\x1c\x00\x00\x00\x1d\x00\x5b\x00\x5c\x00\x3f\x00\x40\x00\x41\x00\x42\x00\x2e\x00\x2f\x00\x00\x00\x30\x00\x18\x00\x19\x00\x00\x00\x1a\x00\x31\x00\x1c\x00\x00\x00\x1d\x00\x5b\x00\x69\x00\x37\x00\x1c\x00\x38\x00\x1d\x00\x2e\x00\x2f\x00\x00\x00\x30\x00\x18\x00\x19\x00\x58\x00\x1a\x00\x31\x00\x1c\x00\x00\x00\x1d\x00\x2e\x00\x2f\x00\x00\x00\x30\x00\x18\x00\x19\x00\x00\x00\x1a\x00\x31\x00\x1c\x00\x00\x00\x1d\x00\x3b\x00\x18\x00\x19\x00\x00\x00\x1a\x00\x1b\x00\x1c\x00\x00\x00\x1d\x00\x54\x00\x18\x00\x19\x00\x00\x00\
\\x1a\x00\x1b\x00\x1c\x00\x00\x00\x1d\x00\x44\x00\x18\x00\x19\x00\x00\x00\x1a\x00\x1b\x00\x1c\x00\x00\x00\x1d\x00\x5e\x00\x18\x00\x19\x00\x00\x00\x1a\x00\x1b\x00\x1c\x00\x42\x00\x1d\x00\x1a\x00\x1b\x00\x1c\x00\x00\x00\x1d\x00\x08\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x37\x00\x1c\x00\x49\x00\x1d\x00\x37\x00\x1c\x00\x47\x00\x1d\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00"#
happyReduceArr = Happy_Data_Array.array (1, 68) [
        (1 , happyReduce_1),
        (2 , happyReduce_2),
        (3 , happyReduce_3),
        (4 , happyReduce_4),
        (5 , happyReduce_5),
        (6 , happyReduce_6),
        (7 , happyReduce_7),
        (8 , happyReduce_8),
        (9 , happyReduce_9),
        (10 , happyReduce_10),
        (11 , happyReduce_11),
        (12 , happyReduce_12),
        (13 , happyReduce_13),
        (14 , happyReduce_14),
        (15 , happyReduce_15),
        (16 , happyReduce_16),
        (17 , happyReduce_17),
        (18 , happyReduce_18),
        (19 , happyReduce_19),
        (20 , happyReduce_20),
        (21 , happyReduce_21),
        (22 , happyReduce_22),
        (23 , happyReduce_23),
        (24 , happyReduce_24),
        (25 , happyReduce_25),
        (26 , happyReduce_26),
        (27 , happyReduce_27),
        (28 , happyReduce_28),
        (29 , happyReduce_29),
        (30 , happyReduce_30),
        (31 , happyReduce_31),
        (32 , happyReduce_32),
        (33 , happyReduce_33),
        (34 , happyReduce_34),
        (35 , happyReduce_35),
        (36 , happyReduce_36),
        (37 , happyReduce_37),
        (38 , happyReduce_38),
        (39 , happyReduce_39),
        (40 , happyReduce_40),
        (41 , happyReduce_41),
        (42 , happyReduce_42),
        (43 , happyReduce_43),
        (44 , happyReduce_44),
        (45 , happyReduce_45),
        (46 , happyReduce_46),
        (47 , happyReduce_47),
        (48 , happyReduce_48),
        (49 , happyReduce_49),
        (50 , happyReduce_50),
        (51 , happyReduce_51),
        (52 , happyReduce_52),
        (53 , happyReduce_53),
        (54 , happyReduce_54),
        (55 , happyReduce_55),
        (56 , happyReduce_56),
        (57 , happyReduce_57),
        (58 , happyReduce_58),
        (59 , happyReduce_59),
        (60 , happyReduce_60),
        (61 , happyReduce_61),
        (62 , happyReduce_62),
        (63 , happyReduce_63),
        (64 , happyReduce_64),
        (65 , happyReduce_65),
        (66 , happyReduce_66),
        (67 , happyReduce_67),
        (68 , happyReduce_68)
        ]
happy_n_terms = 38 :: Int
happy_n_nonterms = 28 :: Int
happyReduce_1 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_1 = happyReduce 5# 0# happyReduction_1
happyReduction_1 (happy_x_5 `HappyStk`
happy_x_4 `HappyStk`
happy_x_3 `HappyStk`
happy_x_2 `HappyStk`
happy_x_1 `HappyStk`
happyRest)
= case happyOut5 happy_x_1 of { (HappyWrap5 happy_var_1) ->
case happyOut6 happy_x_2 of { (HappyWrap6 happy_var_2) ->
case happyOut11 happy_x_4 of { (HappyWrap11 happy_var_4) ->
case happyOut5 happy_x_5 of { (HappyWrap5 happy_var_5) ->
happyIn4
((happy_var_1,happy_var_2,happy_var_4,happy_var_5)
) `HappyStk` happyRest}}}}
happyReduce_2 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_2 = happySpecReduce_1 1# happyReduction_2
happyReduction_2 happy_x_1
= case happyOutTok happy_x_1 of { happy_var_1 ->
happyIn5
(case happy_var_1 of T pos (CodeT code) ->
Just (pos,code)
)}
happyReduce_3 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_3 = happySpecReduce_0 1# happyReduction_3
happyReduction_3 = happyIn5
(Nothing
)
happyReduce_4 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_4 = happySpecReduce_2 2# happyReduction_4
happyReduction_4 happy_x_2
happy_x_1
= case happyOut7 happy_x_1 of { (HappyWrap7 happy_var_1) ->
case happyOut6 happy_x_2 of { (HappyWrap6 happy_var_2) ->
happyIn6
(happy_var_1 : happy_var_2
)}}
happyReduce_5 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_5 = happySpecReduce_0 2# happyReduction_5
happyReduction_5 = happyIn6
([]
)
happyReduce_6 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_6 = happySpecReduce_2 3# happyReduction_6
happyReduction_6 happy_x_2
happy_x_1
= case happyOutTok happy_x_2 of { (T _ (StringT happy_var_2)) ->
happyIn7
(WrapperDirective happy_var_2
)}
happyReduce_7 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_7 = happySpecReduce_2 3# happyReduction_7
happyReduction_7 happy_x_2
happy_x_1
= case happyOut8 happy_x_2 of { (HappyWrap8 happy_var_2) ->
happyIn7
(EncodingDirective happy_var_2
)}
happyReduce_8 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_8 = happySpecReduce_2 3# happyReduction_8
happyReduction_8 happy_x_2
happy_x_1
= case happyOutTok happy_x_2 of { (T _ (StringT happy_var_2)) ->
happyIn7
(ActionType happy_var_2
)}
happyReduce_9 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_9 = happySpecReduce_2 3# happyReduction_9
happyReduction_9 happy_x_2
happy_x_1
= case happyOutTok happy_x_2 of { (T _ (StringT happy_var_2)) ->
happyIn7
(TokenType happy_var_2
)}
happyReduce_10 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_10 = happySpecReduce_2 3# happyReduction_10
happyReduction_10 happy_x_2
happy_x_1
= case happyOutTok happy_x_2 of { (T _ (StringT happy_var_2)) ->
happyIn7
(TypeClass happy_var_2
)}
happyReduce_11 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_11 = happyMonadReduce 1# 4# happyReduction_11
happyReduction_11 (happy_x_1 `HappyStk`
happyRest) tk
= happyThen ((case happyOutTok happy_x_1 of { (T _ (StringT happy_var_1)) ->
( lookupEncoding happy_var_1)})
) (\r -> happyReturn (happyIn8 r))
happyReduce_12 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_12 = happySpecReduce_2 5# happyReduction_12
happyReduction_12 happy_x_2
happy_x_1
= happyIn9
(()
)
happyReduce_13 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_13 = happySpecReduce_0 5# happyReduction_13
happyReduction_13 = happyIn9
(()
)
happyReduce_14 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_14 = happyMonadReduce 2# 6# happyReduction_14
happyReduction_14 (happy_x_2 `HappyStk`
happy_x_1 `HappyStk`
happyRest) tk
= happyThen ((case happyOutTok happy_x_1 of { (T _ (SMacDefT happy_var_1)) ->
case happyOut28 happy_x_2 of { (HappyWrap28 happy_var_2) ->
( newSMac happy_var_1 happy_var_2)}})
) (\r -> happyReturn (happyIn10 r))
happyReduce_15 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_15 = happyMonadReduce 2# 6# happyReduction_15
happyReduction_15 (happy_x_2 `HappyStk`
happy_x_1 `HappyStk`
happyRest) tk
= happyThen ((case happyOutTok happy_x_1 of { (T _ (RMacDefT happy_var_1)) ->
case happyOut23 happy_x_2 of { (HappyWrap23 happy_var_2) ->
( newRMac happy_var_1 happy_var_2)}})
) (\r -> happyReturn (happyIn10 r))
happyReduce_16 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_16 = happySpecReduce_2 7# happyReduction_16
happyReduction_16 happy_x_2
happy_x_1
= case happyOutTok happy_x_1 of { (T _ (BindT happy_var_1)) ->
case happyOut12 happy_x_2 of { (HappyWrap12 happy_var_2) ->
happyIn11
(Scanner happy_var_1 happy_var_2
)}}
happyReduce_17 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_17 = happySpecReduce_2 8# happyReduction_17
happyReduction_17 happy_x_2
happy_x_1
= case happyOut13 happy_x_1 of { (HappyWrap13 happy_var_1) ->
case happyOut12 happy_x_2 of { (HappyWrap12 happy_var_2) ->
happyIn12
(happy_var_1 ++ happy_var_2
)}}
happyReduce_18 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_18 = happySpecReduce_0 8# happyReduction_18
happyReduction_18 = happyIn12
([]
)
happyReduce_19 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_19 = happySpecReduce_2 9# happyReduction_19
happyReduction_19 happy_x_2
happy_x_1
= case happyOut16 happy_x_1 of { (HappyWrap16 happy_var_1) ->
case happyOut14 happy_x_2 of { (HappyWrap14 happy_var_2) ->
happyIn13
([ replaceCodes happy_var_1 happy_var_2 ]
)}}
happyReduce_20 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_20 = happyReduce 4# 9# happyReduction_20
happyReduction_20 (happy_x_4 `HappyStk`
happy_x_3 `HappyStk`
happy_x_2 `HappyStk`
happy_x_1 `HappyStk`
happyRest)
= case happyOut16 happy_x_1 of { (HappyWrap16 happy_var_1) ->
case happyOut15 happy_x_3 of { (HappyWrap15 happy_var_3) ->
happyIn13
(map (replaceCodes happy_var_1) happy_var_3
) `HappyStk` happyRest}}
happyReduce_21 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_21 = happySpecReduce_1 9# happyReduction_21
happyReduction_21 happy_x_1
= case happyOut14 happy_x_1 of { (HappyWrap14 happy_var_1) ->
happyIn13
([ happy_var_1 ]
)}
happyReduce_22 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_22 = happySpecReduce_2 10# happyReduction_22
happyReduction_22 happy_x_2
happy_x_1
= case happyOut20 happy_x_1 of { (HappyWrap20 happy_var_1) ->
case happyOut19 happy_x_2 of { (HappyWrap19 happy_var_2) ->
happyIn14
(let (l,e,r) = happy_var_1 in
RECtx [] l e r happy_var_2
)}}
happyReduce_23 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_23 = happySpecReduce_2 11# happyReduction_23
happyReduction_23 happy_x_2
happy_x_1
= case happyOut14 happy_x_1 of { (HappyWrap14 happy_var_1) ->
case happyOut15 happy_x_2 of { (HappyWrap15 happy_var_2) ->
happyIn15
(happy_var_1 : happy_var_2
)}}
happyReduce_24 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_24 = happySpecReduce_0 11# happyReduction_24
happyReduction_24 = happyIn15
([]
)
happyReduce_25 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_25 = happySpecReduce_3 12# happyReduction_25
happyReduction_25 happy_x_3
happy_x_2
happy_x_1
= case happyOut17 happy_x_2 of { (HappyWrap17 happy_var_2) ->
happyIn16
(happy_var_2
)}
happyReduce_26 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_26 = happySpecReduce_3 13# happyReduction_26
happyReduction_26 happy_x_3
happy_x_2
happy_x_1
= case happyOut18 happy_x_1 of { (HappyWrap18 happy_var_1) ->
case happyOut17 happy_x_3 of { (HappyWrap17 happy_var_3) ->
happyIn17
((happy_var_1,0) : happy_var_3
)}}
happyReduce_27 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_27 = happySpecReduce_1 13# happyReduction_27
happyReduction_27 happy_x_1
= case happyOut18 happy_x_1 of { (HappyWrap18 happy_var_1) ->
happyIn17
([(happy_var_1,0)]
)}
happyReduce_28 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_28 = happySpecReduce_1 14# happyReduction_28
happyReduction_28 happy_x_1
= happyIn18
("0"
)
happyReduce_29 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_29 = happySpecReduce_1 14# happyReduction_29
happyReduction_29 happy_x_1
= case happyOutTok happy_x_1 of { (T _ (IdT happy_var_1)) ->
happyIn18
(happy_var_1
)}
happyReduce_30 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_30 = happySpecReduce_1 15# happyReduction_30
happyReduction_30 happy_x_1
= case happyOutTok happy_x_1 of { happy_var_1 ->
happyIn19
(case happy_var_1 of T _ (CodeT code) -> Just code
)}
happyReduce_31 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_31 = happySpecReduce_1 15# happyReduction_31
happyReduction_31 happy_x_1
= happyIn19
(Nothing
)
happyReduce_32 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_32 = happySpecReduce_3 16# happyReduction_32
happyReduction_32 happy_x_3
happy_x_2
happy_x_1
= case happyOut21 happy_x_1 of { (HappyWrap21 happy_var_1) ->
case happyOut23 happy_x_2 of { (HappyWrap23 happy_var_2) ->
case happyOut22 happy_x_3 of { (HappyWrap22 happy_var_3) ->
happyIn20
((Just happy_var_1,happy_var_2,happy_var_3)
)}}}
happyReduce_33 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_33 = happySpecReduce_2 16# happyReduction_33
happyReduction_33 happy_x_2
happy_x_1
= case happyOut23 happy_x_1 of { (HappyWrap23 happy_var_1) ->
case happyOut22 happy_x_2 of { (HappyWrap22 happy_var_2) ->
happyIn20
((Nothing,happy_var_1,happy_var_2)
)}}
happyReduce_34 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_34 = happySpecReduce_1 17# happyReduction_34
happyReduction_34 happy_x_1
= happyIn21
(charSetSingleton '\n'
)
happyReduce_35 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_35 = happySpecReduce_2 17# happyReduction_35
happyReduction_35 happy_x_2
happy_x_1
= case happyOut28 happy_x_1 of { (HappyWrap28 happy_var_1) ->
happyIn21
(happy_var_1
)}
happyReduce_36 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_36 = happySpecReduce_1 18# happyReduction_36
happyReduction_36 happy_x_1
= happyIn22
(RightContextRExp (Ch (charSetSingleton '\n'))
)
happyReduce_37 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_37 = happySpecReduce_2 18# happyReduction_37
happyReduction_37 happy_x_2
happy_x_1
= case happyOut23 happy_x_2 of { (HappyWrap23 happy_var_2) ->
happyIn22
(RightContextRExp happy_var_2
)}
happyReduce_38 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_38 = happySpecReduce_2 18# happyReduction_38
happyReduction_38 happy_x_2
happy_x_1
= case happyOutTok happy_x_2 of { happy_var_2 ->
happyIn22
(RightContextCode (case happy_var_2 of
T _ (CodeT code) -> code)
)}
happyReduce_39 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_39 = happySpecReduce_0 18# happyReduction_39
happyReduction_39 = happyIn22
(NoRightContext
)
happyReduce_40 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_40 = happySpecReduce_3 19# happyReduction_40
happyReduction_40 happy_x_3
happy_x_2
happy_x_1
= case happyOut24 happy_x_1 of { (HappyWrap24 happy_var_1) ->
case happyOut23 happy_x_3 of { (HappyWrap23 happy_var_3) ->
happyIn23
(happy_var_1 :| happy_var_3
)}}
happyReduce_41 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_41 = happySpecReduce_1 19# happyReduction_41
happyReduction_41 happy_x_1
= case happyOut24 happy_x_1 of { (HappyWrap24 happy_var_1) ->
happyIn23
(happy_var_1
)}
happyReduce_42 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_42 = happySpecReduce_2 20# happyReduction_42
happyReduction_42 happy_x_2
happy_x_1
= case happyOut24 happy_x_1 of { (HappyWrap24 happy_var_1) ->
case happyOut25 happy_x_2 of { (HappyWrap25 happy_var_2) ->
happyIn24
(happy_var_1 :%% happy_var_2
)}}
happyReduce_43 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_43 = happySpecReduce_1 20# happyReduction_43
happyReduction_43 happy_x_1
= case happyOut25 happy_x_1 of { (HappyWrap25 happy_var_1) ->
happyIn24
(happy_var_1
)}
happyReduce_44 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_44 = happySpecReduce_2 21# happyReduction_44
happyReduction_44 happy_x_2
happy_x_1
= case happyOut27 happy_x_1 of { (HappyWrap27 happy_var_1) ->
case happyOut26 happy_x_2 of { (HappyWrap26 happy_var_2) ->
happyIn25
(happy_var_2 happy_var_1
)}}
happyReduce_45 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_45 = happySpecReduce_1 21# happyReduction_45
happyReduction_45 happy_x_1
= case happyOut27 happy_x_1 of { (HappyWrap27 happy_var_1) ->
happyIn25
(happy_var_1
)}
happyReduce_46 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_46 = happySpecReduce_1 22# happyReduction_46
happyReduction_46 happy_x_1
= happyIn26
(Star
)
happyReduce_47 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_47 = happySpecReduce_1 22# happyReduction_47
happyReduction_47 happy_x_1
= happyIn26
(Plus
)
happyReduce_48 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_48 = happySpecReduce_1 22# happyReduction_48
happyReduction_48 happy_x_1
= happyIn26
(Ques
)
happyReduce_49 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_49 = happySpecReduce_3 22# happyReduction_49
happyReduction_49 happy_x_3
happy_x_2
happy_x_1
= case happyOutTok happy_x_2 of { (T _ (CharT happy_var_2)) ->
happyIn26
(repeat_rng (digit happy_var_2) Nothing
)}
happyReduce_50 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_50 = happyReduce 4# 22# happyReduction_50
happyReduction_50 (happy_x_4 `HappyStk`
happy_x_3 `HappyStk`
happy_x_2 `HappyStk`
happy_x_1 `HappyStk`
happyRest)
= case happyOutTok happy_x_2 of { (T _ (CharT happy_var_2)) ->
happyIn26
(repeat_rng (digit happy_var_2) (Just Nothing)
) `HappyStk` happyRest}
happyReduce_51 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_51 = happyReduce 5# 22# happyReduction_51
happyReduction_51 (happy_x_5 `HappyStk`
happy_x_4 `HappyStk`
happy_x_3 `HappyStk`
happy_x_2 `HappyStk`
happy_x_1 `HappyStk`
happyRest)
= case happyOutTok happy_x_2 of { (T _ (CharT happy_var_2)) ->
case happyOutTok happy_x_4 of { (T _ (CharT happy_var_4)) ->
happyIn26
(repeat_rng (digit happy_var_2) (Just (Just (digit happy_var_4)))
) `HappyStk` happyRest}}
happyReduce_52 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_52 = happySpecReduce_2 23# happyReduction_52
happyReduction_52 happy_x_2
happy_x_1
= happyIn27
(Eps
)
happyReduce_53 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_53 = happySpecReduce_1 23# happyReduction_53
happyReduction_53 happy_x_1
= case happyOutTok happy_x_1 of { (T _ (StringT happy_var_1)) ->
happyIn27
(foldr (:%%) Eps
(map (Ch . charSetSingleton) happy_var_1)
)}
happyReduce_54 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_54 = happyMonadReduce 1# 23# happyReduction_54
happyReduction_54 (happy_x_1 `HappyStk`
happyRest) tk
= happyThen ((case happyOutTok happy_x_1 of { (T _ (RMacT happy_var_1)) ->
( lookupRMac happy_var_1)})
) (\r -> happyReturn (happyIn27 r))
happyReduce_55 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_55 = happySpecReduce_1 23# happyReduction_55
happyReduction_55 happy_x_1
= case happyOut28 happy_x_1 of { (HappyWrap28 happy_var_1) ->
happyIn27
(Ch happy_var_1
)}
happyReduce_56 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_56 = happySpecReduce_3 23# happyReduction_56
happyReduction_56 happy_x_3
happy_x_2
happy_x_1
= case happyOut23 happy_x_2 of { (HappyWrap23 happy_var_2) ->
happyIn27
(happy_var_2
)}
happyReduce_57 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_57 = happySpecReduce_3 24# happyReduction_57
happyReduction_57 happy_x_3
happy_x_2
happy_x_1
= case happyOut28 happy_x_1 of { (HappyWrap28 happy_var_1) ->
case happyOut29 happy_x_3 of { (HappyWrap29 happy_var_3) ->
happyIn28
(happy_var_1 `charSetMinus` happy_var_3
)}}
happyReduce_58 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_58 = happySpecReduce_1 24# happyReduction_58
happyReduction_58 happy_x_1
= case happyOut29 happy_x_1 of { (HappyWrap29 happy_var_1) ->
happyIn28
(happy_var_1
)}
happyReduce_59 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_59 = happySpecReduce_1 25# happyReduction_59
happyReduction_59 happy_x_1
= case happyOutTok happy_x_1 of { (T _ (CharT happy_var_1)) ->
happyIn29
(charSetSingleton happy_var_1
)}
happyReduce_60 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_60 = happySpecReduce_3 25# happyReduction_60
happyReduction_60 happy_x_3
happy_x_2
happy_x_1
= case happyOutTok happy_x_1 of { (T _ (CharT happy_var_1)) ->
case happyOutTok happy_x_3 of { (T _ (CharT happy_var_3)) ->
happyIn29
(charSetRange happy_var_1 happy_var_3
)}}
happyReduce_61 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_61 = happyMonadReduce 1# 25# happyReduction_61
happyReduction_61 (happy_x_1 `HappyStk`
happyRest) tk
= happyThen ((case happyOut31 happy_x_1 of { (HappyWrap31 happy_var_1) ->
( lookupSMac happy_var_1)})
) (\r -> happyReturn (happyIn29 r))
happyReduce_62 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_62 = happySpecReduce_3 25# happyReduction_62
happyReduction_62 happy_x_3
happy_x_2
happy_x_1
= case happyOut30 happy_x_2 of { (HappyWrap30 happy_var_2) ->
happyIn29
(foldr charSetUnion emptyCharSet happy_var_2
)}
happyReduce_63 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_63 = happyMonadReduce 4# 25# happyReduction_63
happyReduction_63 (happy_x_4 `HappyStk`
happy_x_3 `HappyStk`
happy_x_2 `HappyStk`
happy_x_1 `HappyStk`
happyRest) tk
= happyThen ((case happyOutTok happy_x_1 of { happy_var_1 ->
case happyOut30 happy_x_3 of { (HappyWrap30 happy_var_3) ->
( do { dot <- lookupSMac (tokPosn happy_var_1, ".");
return (dot `charSetMinus`
foldr charSetUnion emptyCharSet happy_var_3) })}})
) (\r -> happyReturn (happyIn29 r))
happyReduce_64 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_64 = happyMonadReduce 2# 25# happyReduction_64
happyReduction_64 (happy_x_2 `HappyStk`
happy_x_1 `HappyStk`
happyRest) tk
= happyThen ((case happyOutTok happy_x_1 of { happy_var_1 ->
case happyOut29 happy_x_2 of { (HappyWrap29 happy_var_2) ->
( do { dot <- lookupSMac (tokPosn happy_var_1, ".");
return (dot `charSetMinus` happy_var_2) })}})
) (\r -> happyReturn (happyIn29 r))
happyReduce_65 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_65 = happySpecReduce_2 26# happyReduction_65
happyReduction_65 happy_x_2
happy_x_1
= case happyOut28 happy_x_1 of { (HappyWrap28 happy_var_1) ->
case happyOut30 happy_x_2 of { (HappyWrap30 happy_var_2) ->
happyIn30
(happy_var_1 : happy_var_2
)}}
happyReduce_66 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_66 = happySpecReduce_0 26# happyReduction_66
happyReduction_66 = happyIn30
([]
)
happyReduce_67 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_67 = happySpecReduce_1 27# happyReduction_67
happyReduction_67 happy_x_1
= case happyOutTok happy_x_1 of { happy_var_1 ->
happyIn31
((tokPosn happy_var_1, ".")
)}
happyReduce_68 :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduce_68 = happySpecReduce_1 27# happyReduction_68
happyReduction_68 happy_x_1
= case happyOutTok happy_x_1 of { happy_var_1 ->
happyIn31
(case happy_var_1 of T p (SMacT s) -> (p, s)
)}
happyNewToken action sts stk
= lexer(\tk ->
let cont i = happyDoAction i tk action sts stk in
case tk of {
T _ EOFT -> happyDoAction 37# tk action sts stk;
T _ (SpecialT '.') -> cont 1#;
T _ (SpecialT ';') -> cont 2#;
T _ (SpecialT '<') -> cont 3#;
T _ (SpecialT '>') -> cont 4#;
T _ (SpecialT ',') -> cont 5#;
T _ (SpecialT '$') -> cont 6#;
T _ (SpecialT '|') -> cont 7#;
T _ (SpecialT '*') -> cont 8#;
T _ (SpecialT '+') -> cont 9#;
T _ (SpecialT '?') -> cont 10#;
T _ (SpecialT '{') -> cont 11#;
T _ (SpecialT '}') -> cont 12#;
T _ (SpecialT '(') -> cont 13#;
T _ (SpecialT ')') -> cont 14#;
T _ (SpecialT '#') -> cont 15#;
T _ (SpecialT '~') -> cont 16#;
T _ (SpecialT '-') -> cont 17#;
T _ (SpecialT '[') -> cont 18#;
T _ (SpecialT ']') -> cont 19#;
T _ (SpecialT '^') -> cont 20#;
T _ (SpecialT '/') -> cont 21#;
T _ ZeroT -> cont 22#;
T _ (StringT happy_dollar_dollar) -> cont 23#;
T _ (BindT happy_dollar_dollar) -> cont 24#;
T _ (IdT happy_dollar_dollar) -> cont 25#;
T _ (CodeT _) -> cont 26#;
T _ (CharT happy_dollar_dollar) -> cont 27#;
T _ (SMacT _) -> cont 28#;
T _ (RMacT happy_dollar_dollar) -> cont 29#;
T _ (SMacDefT happy_dollar_dollar) -> cont 30#;
T _ (RMacDefT happy_dollar_dollar) -> cont 31#;
T _ WrapperT -> cont 32#;
T _ EncodingT -> cont 33#;
T _ ActionTypeT -> cont 34#;
T _ TokenTypeT -> cont 35#;
T _ TypeClassT -> cont 36#;
_ -> happyError' (tk, [])
})
happyError_ explist 37# tk = happyError' (tk, explist)
happyError_ explist _ tk = happyError' (tk, explist)
happyThen :: () => P a -> (a -> P b) -> P b
happyThen = ((>>=))
happyReturn :: () => a -> P a
happyReturn = (return)
happyParse :: () => Happy_GHC_Exts.Int# -> P (HappyAbsSyn )
happyNewToken :: () => Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyDoAction :: () => Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn )
happyReduceArr :: () => Happy_Data_Array.Array Int (Happy_GHC_Exts.Int# -> Token -> Happy_GHC_Exts.Int# -> Happy_IntList -> HappyStk (HappyAbsSyn ) -> P (HappyAbsSyn ))
happyThen1 :: () => P a -> (a -> P b) -> P b
happyThen1 = happyThen
happyReturn1 :: () => a -> P a
happyReturn1 = happyReturn
happyError' :: () => ((Token), [String]) -> P a
happyError' tk = (\(tokens, explist) -> happyError) tk
parse = happySomeParser where
happySomeParser = happyThen (happyParse 0#) (\x -> happyReturn (let {(HappyWrap4 x') = happyOut4 x} in x'))
happySeq = happyDontSeq
happyError :: P a
happyError = failP "parse error"
-- -----------------------------------------------------------------------------
-- Utils
digit c = ord c - ord '0'
repeat_rng :: Int -> Maybe (Maybe Int) -> (RExp->RExp)
repeat_rng n (Nothing) re = foldr (:%%) Eps (replicate n re)
repeat_rng n (Just Nothing) re = foldr (:%%) (Star re) (replicate n re)
repeat_rng n (Just (Just m)) re = intl :%% rst
where
intl = repeat_rng n Nothing re
rst = foldr (\re re'->Ques(re :%% re')) Eps (replicate (m-n) re)
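-- The repeat_rng utility above expands bounded repetition ({n}, {n,}, {n,m})
-- into plain regexp combinators. A self-contained sketch of the same expansion,
-- using a local stand-in for AbsSyn's RExp type (the constructor names mirror
-- the real ones, but this R type is defined here purely for illustration):

```haskell
-- Stand-in for AbsSyn.RExp, for illustration only.
data R = Eps | Ch Char | R :%% R | Star R | Ques R deriving Show
infixr 5 :%%

repeatRng :: Int -> Maybe (Maybe Int) -> R -> R
repeatRng n Nothing         re = foldr (:%%) Eps (replicate n re)        -- r{n}
repeatRng n (Just Nothing)  re = foldr (:%%) (Star re) (replicate n re)  -- r{n,}
repeatRng n (Just (Just m)) re = intl :%% rst                            -- r{n,m}
  where
    intl = repeatRng n Nothing re
    -- each optional extra copy is wrapped in Ques, nested right-to-left
    rst  = foldr (\r r' -> Ques (r :%% r')) Eps (replicate (m - n) re)
```

-- For example, repeatRng 2 (Just (Just 3)) (Ch 'a') expands a{2,3} into two
-- mandatory copies followed by one optional copy:
-- (Ch 'a' :%% (Ch 'a' :%% Eps)) :%% Ques (Ch 'a' :%% Eps)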
replaceCodes codes rectx = rectx{ reCtxStartCodes = codes }
lookupEncoding :: String -> P Encoding
lookupEncoding s = case map toLower s of
"iso-8859-1" -> return Latin1
"latin1" -> return Latin1
"utf-8" -> return UTF8
"utf8" -> return UTF8
_ -> failP ("encoding " ++ show s ++ " not supported")
{-# LINE 1 "templates/GenericTemplate.hs" #-}
{-# LINE 1 "templates/GenericTemplate.hs" #-}
{-# LINE 1 "" #-}
{-# LINE 1 "" #-}
{-# LINE 10 "" #-}
# 1 "/usr/include/stdc-predef.h" 1 3 4
# 17 "/usr/include/stdc-predef.h" 3 4
{-# LINE 10 "" #-}
{-# LINE 1 "/home/smarlow/local/lib/ghc-8.4.3/include/ghcversion.h" #-}
{-# LINE 10 "" #-}
{-# LINE 1 "/tmp/ghc27342_0/ghc_2.h" #-}
{-# LINE 10 "" #-}
{-# LINE 1 "templates/GenericTemplate.hs" #-}
-- Id: GenericTemplate.hs,v 1.26 2005/01/14 14:47:22 simonmar Exp
-- Do not remove this comment. Required to fix CPP parsing when using GCC and a clang-compiled alex.
#if __GLASGOW_HASKELL__ > 706
#define LT(n,m) ((Happy_GHC_Exts.tagToEnum# (n Happy_GHC_Exts.<# m)) :: Bool)
#define GTE(n,m) ((Happy_GHC_Exts.tagToEnum# (n Happy_GHC_Exts.>=# m)) :: Bool)
#define EQ(n,m) ((Happy_GHC_Exts.tagToEnum# (n Happy_GHC_Exts.==# m)) :: Bool)
#else
#define LT(n,m) (n Happy_GHC_Exts.<# m)
#define GTE(n,m) (n Happy_GHC_Exts.>=# m)
#define EQ(n,m) (n Happy_GHC_Exts.==# m)
#endif
{-# LINE 43 "templates/GenericTemplate.hs" #-}
data Happy_IntList = HappyCons Happy_GHC_Exts.Int# Happy_IntList
{-# LINE 65 "templates/GenericTemplate.hs" #-}
{-# LINE 75 "templates/GenericTemplate.hs" #-}
{-# LINE 84 "templates/GenericTemplate.hs" #-}
infixr 9 `HappyStk`
data HappyStk a = HappyStk a (HappyStk a)
-----------------------------------------------------------------------------
-- starting the parse
happyParse start_state = happyNewToken start_state notHappyAtAll notHappyAtAll
-----------------------------------------------------------------------------
-- Accepting the parse
-- If the current token is 0#, it means we've just accepted a partial
-- parse (a %partial parser). We must ignore the saved token on the top of
-- the stack in this case.
happyAccept 0# tk st sts (_ `HappyStk` ans `HappyStk` _) =
happyReturn1 ans
happyAccept j tk st sts (HappyStk ans _) =
(happyTcHack j (happyTcHack st)) (happyReturn1 ans)
-----------------------------------------------------------------------------
-- Arrays only: do the next action
happyDoAction i tk st
= {- nothing -}
case action of
0# -> {- nothing -}
happyFail (happyExpListPerState ((Happy_GHC_Exts.I# (st)) :: Int)) i tk st
-1# -> {- nothing -}
happyAccept i tk st
n | LT(n,(0# :: Happy_GHC_Exts.Int#)) -> {- nothing -}
(happyReduceArr Happy_Data_Array.! rule) i tk st
where rule = (Happy_GHC_Exts.I# ((Happy_GHC_Exts.negateInt# ((n Happy_GHC_Exts.+# (1# :: Happy_GHC_Exts.Int#))))))
n -> {- nothing -}
happyShift new_state i tk st
where new_state = (n Happy_GHC_Exts.-# (1# :: Happy_GHC_Exts.Int#))
where off = happyAdjustOffset (indexShortOffAddr happyActOffsets st)
off_i = (off Happy_GHC_Exts.+# i)
check = if GTE(off_i,(0# :: Happy_GHC_Exts.Int#))
then EQ(indexShortOffAddr happyCheck off_i, i)
else False
action
| check = indexShortOffAddr happyTable off_i
| otherwise = indexShortOffAddr happyDefActions st
indexShortOffAddr (HappyA# arr) off =
Happy_GHC_Exts.narrow16Int# i
where
i = Happy_GHC_Exts.word2Int# (Happy_GHC_Exts.or# (Happy_GHC_Exts.uncheckedShiftL# high 8#) low)
high = Happy_GHC_Exts.int2Word# (Happy_GHC_Exts.ord# (Happy_GHC_Exts.indexCharOffAddr# arr (off' Happy_GHC_Exts.+# 1#)))
low = Happy_GHC_Exts.int2Word# (Happy_GHC_Exts.ord# (Happy_GHC_Exts.indexCharOffAddr# arr off'))
off' = off Happy_GHC_Exts.*# 2#
{-# INLINE happyLt #-}
happyLt x y = LT(x,y)
readArrayBit arr bit =
Bits.testBit (Happy_GHC_Exts.I# (indexShortOffAddr arr ((unbox_int bit) `Happy_GHC_Exts.iShiftRA#` 4#))) (bit `mod` 16)
where unbox_int (Happy_GHC_Exts.I# x) = x
data HappyAddr = HappyA# Happy_GHC_Exts.Addr#
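-- indexShortOffAddr above reconstructs a signed 16-bit table entry from two
-- consecutive bytes stored little-endian in an Addr#. A boxed sketch of the
-- same decoding, using a [Word8] in place of the raw address (hypothetical
-- helper, not part of the template):

```haskell
import Data.Bits (shiftL, (.|.))
import Data.Int  (Int16)
import Data.Word (Word8)

-- Entry i lives at bytes 2*i (low) and 2*i+1 (high); round-tripping the
-- combined value through Int16 restores the sign, as narrow16Int# does
-- in the unboxed version.
indexShort :: [Word8] -> Int -> Int
indexShort arr off = fromIntegral (fromIntegral combined :: Int16)
  where
    lo       = fromIntegral (arr !! (2 * off))     :: Int
    hi       = fromIntegral (arr !! (2 * off + 1)) :: Int
    combined = (hi `shiftL` 8) .|. lo
```

-- e.g. indexShort [0x34, 0x12] 0 == 0x1234, and indexShort [0xFF, 0xFF] 0 == -1
-- (the sign-extension is what lets negative action codes encode reductions).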
-----------------------------------------------------------------------------
-- HappyState data type (not arrays)
{-# LINE 180 "templates/GenericTemplate.hs" #-}
-----------------------------------------------------------------------------
-- Shifting a token
happyShift new_state 0# tk st sts stk@(x `HappyStk` _) =
let i = (case Happy_GHC_Exts.unsafeCoerce# x of { (Happy_GHC_Exts.I# (i)) -> i }) in
-- trace "shifting the error token" $
happyDoAction i tk new_state (HappyCons (st) (sts)) (stk)
happyShift new_state i tk st sts stk =
happyNewToken new_state (HappyCons (st) (sts)) ((happyInTok (tk))`HappyStk`stk)
-- happyReduce is specialised for the common cases.
happySpecReduce_0 i fn 0# tk st sts stk
= happyFail [] 0# tk st sts stk
happySpecReduce_0 nt fn j tk st@((action)) sts stk
= happyGoto nt j tk st (HappyCons (st) (sts)) (fn `HappyStk` stk)
happySpecReduce_1 i fn 0# tk st sts stk
= happyFail [] 0# tk st sts stk
happySpecReduce_1 nt fn j tk _ sts@((HappyCons (st@(action)) (_))) (v1`HappyStk`stk')
= let r = fn v1 in
happySeq r (happyGoto nt j tk st sts (r `HappyStk` stk'))
happySpecReduce_2 i fn 0# tk st sts stk
= happyFail [] 0# tk st sts stk
happySpecReduce_2 nt fn j tk _ (HappyCons (_) (sts@((HappyCons (st@(action)) (_))))) (v1`HappyStk`v2`HappyStk`stk')
= let r = fn v1 v2 in
happySeq r (happyGoto nt j tk st sts (r `HappyStk` stk'))
happySpecReduce_3 i fn 0# tk st sts stk
= happyFail [] 0# tk st sts stk
happySpecReduce_3 nt fn j tk _ (HappyCons (_) ((HappyCons (_) (sts@((HappyCons (st@(action)) (_))))))) (v1`HappyStk`v2`HappyStk`v3`HappyStk`stk')
= let r = fn v1 v2 v3 in
happySeq r (happyGoto nt j tk st sts (r `HappyStk` stk'))
happyReduce k i fn 0# tk st sts stk
= happyFail [] 0# tk st sts stk
happyReduce k nt fn j tk st sts stk
= case happyDrop (k Happy_GHC_Exts.-# (1# :: Happy_GHC_Exts.Int#)) sts of
sts1@((HappyCons (st1@(action)) (_))) ->
let r = fn stk in -- it doesn't hurt to always seq here...
happyDoSeq r (happyGoto nt j tk st1 sts1 r)
happyMonadReduce k nt fn 0# tk st sts stk
= happyFail [] 0# tk st sts stk
happyMonadReduce k nt fn j tk st sts stk =
case happyDrop k (HappyCons (st) (sts)) of
sts1@((HappyCons (st1@(action)) (_))) ->
let drop_stk = happyDropStk k stk in
happyThen1 (fn stk tk) (\r -> happyGoto nt j tk st1 sts1 (r `HappyStk` drop_stk))
happyMonad2Reduce k nt fn 0# tk st sts stk
= happyFail [] 0# tk st sts stk
happyMonad2Reduce k nt fn j tk st sts stk =
case happyDrop k (HappyCons (st) (sts)) of
sts1@((HappyCons (st1@(action)) (_))) ->
let drop_stk = happyDropStk k stk
off = happyAdjustOffset (indexShortOffAddr happyGotoOffsets st1)
off_i = (off Happy_GHC_Exts.+# nt)
new_state = indexShortOffAddr happyTable off_i
in
happyThen1 (fn stk tk) (\r -> happyNewToken new_state sts1 (r `HappyStk` drop_stk))
happyDrop 0# l = l
happyDrop n (HappyCons (_) (t)) = happyDrop (n Happy_GHC_Exts.-# (1# :: Happy_GHC_Exts.Int#)) t
happyDropStk 0# l = l
happyDropStk n (x `HappyStk` xs) = happyDropStk (n Happy_GHC_Exts.-# (1#::Happy_GHC_Exts.Int#)) xs
-----------------------------------------------------------------------------
-- Moving to a new state after a reduction
happyGoto nt j tk st =
{- nothing -}
happyDoAction j tk new_state
where off = happyAdjustOffset (indexShortOffAddr happyGotoOffsets st)
off_i = (off Happy_GHC_Exts.+# nt)
new_state = indexShortOffAddr happyTable off_i
-----------------------------------------------------------------------------
-- Error recovery (0# is the error token)
-- parse error if we are in recovery and we fail again
happyFail explist 0# tk old_st _ stk@(x `HappyStk` _) =
let i = (case Happy_GHC_Exts.unsafeCoerce# x of { (Happy_GHC_Exts.I# (i)) -> i }) in
-- trace "failing" $
happyError_ explist i tk
{- We don't need state discarding for our restricted implementation of
"error". In fact, it can cause some bogus parses, so I've disabled it
for now --SDM
-- discard a state
happyFail 0# tk old_st (HappyCons ((action)) (sts))
(saved_tok `HappyStk` _ `HappyStk` stk) =
-- trace ("discarding state, depth " ++ show (length stk)) $
happyDoAction 0# tk action sts ((saved_tok`HappyStk`stk))
-}
-- Enter error recovery: generate an error token,
-- save the old token and carry on.
happyFail explist i tk (action) sts stk =
-- trace "entering error recovery" $
happyDoAction 0# tk action sts ( (Happy_GHC_Exts.unsafeCoerce# (Happy_GHC_Exts.I# (i))) `HappyStk` stk)
-- Internal happy errors:
notHappyAtAll :: a
notHappyAtAll = error "Internal Happy error\n"
-----------------------------------------------------------------------------
-- Hack to get the typechecker to accept our action functions
happyTcHack :: Happy_GHC_Exts.Int# -> a -> a
happyTcHack x y = y
{-# INLINE happyTcHack #-}
-----------------------------------------------------------------------------
-- Seq-ing. If the --strict flag is given, then Happy emits
-- happySeq = happyDoSeq
-- otherwise it emits
-- happySeq = happyDontSeq
happyDoSeq, happyDontSeq :: a -> b -> b
happyDoSeq a b = a `seq` b
happyDontSeq a b = b
-----------------------------------------------------------------------------
-- Don't inline any functions from the template. GHC has a nasty habit
-- of deciding to inline happyGoto everywhere, which increases the size of
-- the generated parser quite a bit.
{-# NOINLINE happyDoAction #-}
{-# NOINLINE happyTable #-}
{-# NOINLINE happyCheck #-}
{-# NOINLINE happyActOffsets #-}
{-# NOINLINE happyGotoOffsets #-}
{-# NOINLINE happyDefActions #-}
{-# NOINLINE happyShift #-}
{-# NOINLINE happySpecReduce_0 #-}
{-# NOINLINE happySpecReduce_1 #-}
{-# NOINLINE happySpecReduce_2 #-}
{-# NOINLINE happySpecReduce_3 #-}
{-# NOINLINE happyReduce #-}
{-# NOINLINE happyMonadReduce #-}
{-# NOINLINE happyGoto #-}
{-# NOINLINE happyFail #-}
-- end of Happy Template.
alex-3.2.5/src/Parser.y.boot 0000755 0000000 0000000 00000014755 07346545000 014041 0 ustar 00 0000000 0000000 {
-- -----------------------------------------------------------------------------
--
-- Parser.y, part of Alex
--
-- (c) Simon Marlow 2003
--
-- -----------------------------------------------------------------------------
{-# OPTIONS_GHC -w #-}
module Parser ( parse, P ) where
import AbsSyn
import Scan
import CharSet
import ParseMonad hiding ( StartCode )
import Data.Char
--import Debug.Trace
}
%tokentype { Token }
%name parse
%monad { P } { (>>=) } { return }
%lexer { lexer } { T _ EOFT }
%token
'.' { T _ (SpecialT '.') }
';' { T _ (SpecialT ';') }
'<' { T _ (SpecialT '<') }
'>' { T _ (SpecialT '>') }
',' { T _ (SpecialT ',') }
'$' { T _ (SpecialT '$') }
'|' { T _ (SpecialT '|') }
'*' { T _ (SpecialT '*') }
'+' { T _ (SpecialT '+') }
'?' { T _ (SpecialT '?') }
'{' { T _ (SpecialT '{') }
'}' { T _ (SpecialT '}') }
'(' { T _ (SpecialT '(') }
')' { T _ (SpecialT ')') }
'#' { T _ (SpecialT '#') }
'~' { T _ (SpecialT '~') }
'-' { T _ (SpecialT '-') }
'[' { T _ (SpecialT '[') }
']' { T _ (SpecialT ']') }
'^' { T _ (SpecialT '^') }
'/' { T _ (SpecialT '/') }
ZERO { T _ ZeroT }
STRING { T _ (StringT $$) }
BIND { T _ (BindT $$) }
ID { T _ (IdT $$) }
CODE { T _ (CodeT _) }
CHAR { T _ (CharT $$) }
SMAC { T _ (SMacT _) }
RMAC { T _ (RMacT $$) }
SMAC_DEF { T _ (SMacDefT $$) }
RMAC_DEF { T _ (RMacDefT $$) }
WRAPPER { T _ WrapperT }
ENCODING { T _ EncodingT }
ACTIONTYPE { T _ ActionTypeT }
TOKENTYPE { T _ TokenTypeT }
TYPECLASS { T _ TypeClassT }
%%
alex :: { (Maybe (AlexPosn,Code), [Directive], Scanner, Maybe (AlexPosn,Code)) }
: maybe_code directives macdefs scanner maybe_code { ($1,$2,$4,$5) }
maybe_code :: { Maybe (AlexPosn,Code) }
: CODE { case $1 of T pos (CodeT code) ->
Just (pos,code) }
| {- empty -} { Nothing }
directives :: { [Directive] }
: directive directives { $1 : $2 }
| {- empty -} { [] }
directive :: { Directive }
: WRAPPER STRING { WrapperDirective $2 }
| ENCODING encoding { EncodingDirective $2 }
| ACTIONTYPE STRING { ActionType $2 }
| TOKENTYPE STRING { TokenType $2 }
| TYPECLASS STRING { TypeClass $2 }
encoding :: { Encoding }
: STRING {% lookupEncoding $1 }
macdefs :: { () }
: macdef macdefs { () }
| {- empty -} { () }
-- hack: the lexer looks for the '=' in a macro definition, because there
-- doesn't seem to be a way to formulate the grammar here to avoid a
-- conflict (it needs LR(2) rather than LR(1) to find the '=' and distinguish
-- an SMAC/RMAC at the beginning of a definition from an SMAC/RMAC that is
-- part of a regexp in the previous definition).
macdef :: { () }
: SMAC_DEF set {% newSMac $1 $2 }
| RMAC_DEF rexp {% newRMac $1 $2 }
scanner :: { Scanner }
: BIND tokendefs { Scanner $1 $2 }
tokendefs :: { [RECtx] }
: tokendef tokendefs { $1 ++ $2 }
| {- empty -} { [] }
tokendef :: { [RECtx] }
: startcodes rule { [ replaceCodes $1 $2 ] }
| startcodes '{' rules '}' { map (replaceCodes $1) $3 }
| rule { [ $1 ] }
rule :: { RECtx }
: context rhs { let (l,e,r) = $1 in
RECtx [] l e r $2 }
rules :: { [RECtx] }
: rule rules { $1 : $2 }
| {- empty -} { [] }
startcodes :: { [(String,StartCode)] }
: '<' startcodes0 '>' { $2 }
startcodes0 :: { [(String,StartCode)] }
: startcode ',' startcodes0 { ($1,0) : $3 }
| startcode { [($1,0)] }
startcode :: { String }
: ZERO { "0" }
| ID { $1 }
rhs :: { Maybe Code }
: CODE { case $1 of T _ (CodeT code) -> Just code }
| ';' { Nothing }
context :: { (Maybe CharSet, RExp, RightContext RExp) }
: left_ctx rexp right_ctx { (Just $1,$2,$3) }
| rexp right_ctx { (Nothing,$1,$2) }
left_ctx :: { CharSet }
: '^' { charSetSingleton '\n' }
| set '^' { $1 }
right_ctx :: { RightContext RExp }
: '$' { RightContextRExp (Ch (charSetSingleton '\n')) }
| '/' rexp { RightContextRExp $2 }
| '/' CODE { RightContextCode (case $2 of
T _ (CodeT code) -> code) }
| {- empty -} { NoRightContext }
rexp :: { RExp }
: alt '|' rexp { $1 :| $3 }
| alt { $1 }
alt :: { RExp }
: alt term { $1 :%% $2 }
| term { $1 }
term :: { RExp }
: rexp0 rep { $2 $1 }
| rexp0 { $1 }
rep :: { RExp -> RExp }
: '*' { Star }
| '+' { Plus }
| '?' { Ques }
-- TODO: these don't check for digits
-- properly.
| '{' CHAR '}' { repeat_rng (digit $2) Nothing }
| '{' CHAR ',' '}' { repeat_rng (digit $2) (Just Nothing) }
| '{' CHAR ',' CHAR '}' { repeat_rng (digit $2) (Just (Just (digit $4))) }
rexp0 :: { RExp }
: '(' ')' { Eps }
| STRING { foldr (:%%) Eps
(map (Ch . charSetSingleton) $1) }
| RMAC {% lookupRMac $1 }
| set { Ch $1 }
| '(' rexp ')' { $2 }
set :: { CharSet }
: set '#' set0 { $1 `charSetMinus` $3 }
| set0 { $1 }
set0 :: { CharSet }
: CHAR { charSetSingleton $1 }
| CHAR '-' CHAR { charSetRange $1 $3 }
| smac {% lookupSMac $1 }
| '[' sets ']' { foldr charSetUnion emptyCharSet $2 }
-- [^sets] is the same as '. # [sets]'
-- The upshot is that [^set] does *not* match a newline character,
-- which seems much more useful than just taking the complement.
| '[' '^' sets ']'
{% do { dot <- lookupSMac (tokPosn $1, ".");
return (dot `charSetMinus`
foldr charSetUnion emptyCharSet $3) }}
-- ~set is the same as '. # set'
| '~' set0 {% do { dot <- lookupSMac (tokPosn $1, ".");
return (dot `charSetMinus` $2) } }
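The comments above note that `[^sets]` and `~set` both desugar to `. # set`, and that `.` itself excludes newline. A tiny standalone sketch (modelling character sets as plain `Char` lists over an assumed ASCII universe, rather than Alex's `CharSet`) illustrates why a negated set therefore never matches `'\n'`:

```haskell
import Data.List ((\\))

-- Assumed ASCII universe for illustration; '.' is everything but newline.
dot :: [Char]
dot = ['\x00' .. '\x7f'] \\ "\n"

-- Negation as set difference against '.', mirroring the grammar's
-- 'dot `charSetMinus` set' desugaring (names here are illustrative).
negSet :: [Char] -> [Char]
negSet cs = dot \\ cs

main :: IO ()
main = do
  print ('\n' `elem` negSet "abc")  -- newline is excluded by '.'
  print ('z'  `elem` negSet "abc")
```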
sets :: { [CharSet] }
: set sets { $1 : $2 }
| {- empty -} { [] }
smac :: { (AlexPosn,String) }
: '.' { (tokPosn $1, ".") }
| SMAC { case $1 of T p (SMacT s) -> (p, s) }
{
happyError :: P a
happyError = failP "parse error"
-- -----------------------------------------------------------------------------
-- Utils
digit :: Char -> Int
digit c = ord c - ord '0'
repeat_rng :: Int -> Maybe (Maybe Int) -> (RExp -> RExp)
repeat_rng n Nothing re = foldr (:%%) Eps (replicate n re)
repeat_rng n (Just Nothing) re = foldr (:%%) (Star re) (replicate n re)
repeat_rng n (Just (Just m)) re = intl :%% rst
  where
    intl = repeat_rng n Nothing re
    rst = foldr (\re re' -> Ques (re :%% re')) Eps (replicate (m-n) re)
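The bounded-repetition expansion above can be checked with a self-contained sketch, using a simplified local `R` type standing in for Alex's `RExp` (the type and names here are illustrative, not the library's):

```haskell
-- Minimal stand-in for Alex's regexp AST: epsilon, a symbol,
-- concatenation, Kleene star, and optional.
data R = Eps | Sym Char | R :%% R | Star R | Ques R
  deriving (Eq, Show)

-- Mirror of repeat_rng: r{n} is n copies, r{n,} is n copies then r*,
-- and r{n,m} is n copies followed by (m-n) nested optionals.
repeatRng :: Int -> Maybe (Maybe Int) -> R -> R
repeatRng n Nothing         re = foldr (:%%) Eps (replicate n re)
repeatRng n (Just Nothing)  re = foldr (:%%) (Star re) (replicate n re)
repeatRng n (Just (Just m)) re = intl :%% rest
  where
    intl = repeatRng n Nothing re
    rest = foldr (\r r' -> Ques (r :%% r')) Eps (replicate (m - n) re)

main :: IO ()
main = do
  print (repeatRng 2 Nothing (Sym 'a'))          -- a{2}
  print (repeatRng 2 (Just Nothing) (Sym 'a'))   -- a{2,}
  print (repeatRng 1 (Just (Just 3)) (Sym 'a'))  -- a{1,3}
```

Note how `r{1,3}` becomes one mandatory `r` followed by nested `Ques` wrappers, so the optional tail can stop after any of the remaining repetitions.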
replaceCodes :: [(String,StartCode)] -> RECtx -> RECtx
replaceCodes codes rectx = rectx{ reCtxStartCodes = codes }
lookupEncoding :: String -> P Encoding
lookupEncoding s = case map toLower s of
"iso-8859-1" -> return Latin1
"latin1" -> return Latin1
"utf-8" -> return UTF8
"utf8" -> return UTF8
_ -> failP ("encoding " ++ show s ++ " not supported")
}
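For illustration, the case-insensitive matching in `lookupEncoding` can be sketched outside the parser's `P` monad; this standalone version returns `Either` instead of using `failP`, an assumption made purely so the example runs on its own:

```haskell
import Data.Char (toLower)

-- Local stand-in for the parser's Encoding type.
data Encoding = Latin1 | UTF8 deriving (Eq, Show)

-- Normalise the name to lower case, then accept a few spellings,
-- as in the grammar's lookupEncoding.
lookupEncoding :: String -> Either String Encoding
lookupEncoding s = case map toLower s of
  "iso-8859-1" -> Right Latin1
  "latin1"     -> Right Latin1
  "utf-8"      -> Right UTF8
  "utf8"       -> Right UTF8
  _            -> Left ("encoding " ++ show s ++ " not supported")

main :: IO ()
main = do
  print (lookupEncoding "UTF-8")
  print (lookupEncoding "Latin1")
  print (lookupEncoding "ebcdic")
```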
alex-3.2.5/src/Scan.hs 0000644 0000000 0000000 00000504340 07346545000 012660 0 ustar 00 0000000 0000000 {-# OPTIONS_GHC -fno-warn-unused-binds -fno-warn-missing-signatures #-}
{-# LANGUAGE CPP,MagicHash #-}
{-# LINE 13 "src/Scan.x" #-}
module Scan (lexer, AlexPosn(..), Token(..), Tkn(..), tokPosn) where
import Data.Char
import ParseMonad
--import Debug.Trace
#if __GLASGOW_HASKELL__ >= 603
#include "ghcconfig.h"
#elif defined(__GLASGOW_HASKELL__)
#include "config.h"
#endif
#if __GLASGOW_HASKELL__ >= 503
import Data.Array
import Data.Array.Base (unsafeAt)
#else
import Array
#endif
#if __GLASGOW_HASKELL__ >= 503
import GHC.Exts
#else
import GlaExts
#endif
alex_tab_size :: Int
alex_tab_size = 8
alex_base :: AlexAddr
alex_base = AlexA#
"\xf8\xff\xff\xff\x72\x00\x00\x00\xe5\x00\x00\x00\xd9\xff\xff\xff\xda\xff\xff\xff\xea\x00\x00\x00\xa1\xff\xff\xff\x96\xff\xff\xff\xc3\x01\x00\x00\x17\x02\x00\x00\x77\x00\x00\x00\x15\x02\x00\x00\x7c\x00\x00\x00\x0b\x03\x00\x00\x8b\x03\x00\x00\x0b\x04\x00\x00\xa5\xff\xff\xff\x9c\xff\xff\xff\xa7\xff\xff\xff\x9a\xff\xff\xff\x8b\x04\x00\x00\x0b\x05\x00\x00\xe8\x02\x00\x00\xca\x05\x00\x00\xe1\xff\xff\xff\xe2\xff\xff\xff\xc5\x05\x00\x00\x45\x06\x00\x00\xc5\x06\x00\x00\x45\x07\x00\x00\xe3\xff\xff\xff\xa8\xff\xff\xff\xa9\xff\xff\xff\xaa\xff\xff\xff\xb2\xff\xff\xff\xc5\x07\x00\x00\x45\x08\x00\x00\xc5\x08\x00\x00\x45\x09\x00\x00\xc5\x09\x00\x00\x45\x0a\x00\x00\xc5\x0a\x00\x00\x1b\x00\x00\x00\x45\x0b\x00\x00\xc5\x0b\x00\x00\x45\x0c\x00\x00\xb3\xff\xff\xff\x09\x00\x00\x00\xb4\xff\xff\xff\x16\x00\x00\x00\x17\x00\x00\x00\xb1\xff\xff\xff\x1c\x00\x00\x00\x1d\x00\x00\x00\x1e\x00\x00\x00\x2c\x00\x00\x00\x00\x00\x00\x00\xb6\x0c\x00\x00\x00\x00\x00\x00\x27\x0d\x00\x00\x00\x00\x00\x00\x98\x0d\x00\x00\x22\x00\x00\x00\x1f\x00\x00\x00\x2b\x00\x00\x00\x25\x00\x00\x00\x00\x00\x00\x00\xd9\x0d\x00\x00\x00\x00\x00\x00\x4a\x0e\x00\x00\x00\x00\x00\x00\xbb\x0e\x00\x00\x00\x00\x00\x00\xfc\x0e\x00\x00\x00\x00\x00\x00\x3d\x0f\x00\x00\x00\x00\x00\x00\xae\x0f\x00\x00\x00\x00\x00\x00\x1f\x10\x00\x00\x26\x00\x00\x00\x1f\x11\x00\x00\xdf\x10\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x20\x11\x00\x00\x00\x00\x00\x00\x91\x11\x00\x00\x00\x00\x00\x00\x02\x12\x00\x00\x00\x00\x00\x00\x43\x12\x00\x00\x43\x13\x00\x00\x03\x13\x00\x00\x00\x00\x00\x00\x03\x14\x00\x00\xc3\x13\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x04\x14\x00\x00\x00\x00\x00\x00\x45\x14\x00\x00\x2a\x00\x00\x00\x3b\x15\x00\x00\x32\x16\x00\x00\x3d\x15\x00\x00\xad\x16\x00\x00\xb8\x14\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x6e\x16\x00\x00\x00\x00\x00\x00\xaf\x16\x00\x00\xaf\x17\x00\x00\xff\x17\x00\x00\x79\x17\x00\x00\x00\x00\x00\x00\xff\x18\x00\x00\xbf\x18\x00\x00\x00\x00\x00\x00\xbf\x19\x00\x00\x7f\x19\x00\x00\x00\x00\x00\x00\x2d\x00\x00\x00\x2f\x00\x00\x0
0\x27\x00\x00\x00\x75\x1a\x00\x00\x75\x1b\x00\x00\xd8\x19\x00\x00\x00\x00\x00\x00\xf5\x1b\x00\x00\x45\x1c\x00\x00\xbf\x1b\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xec\x00\x00\x00\x3b\x1d\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x31\x1e\x00\x00\x00\x00\x00\x00\x6d\x00\x00\x00\x05\x1d\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x74\x00\x00\x00\x51\x1d\x00\x00\xca\x00\x00\x00\x00\x00\x00\x00\xe8\x00\x00\x00\x0c\x1e\x00\x00\x71\x00\x00\x00\x32\x1e\x00\x00\x00\x00\x00\x00\x28\x1f\x00\x00\x83\x1f\x00\x00\xf9\x1f\x00\x00\xf0\x20\x00\x00\xfc\x1f\x00\x00\x62\x21\x00\x00\x01\x20\x00\x00\x80\x21\x00\x00\x00\x00\x00\x00\x76\x22\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x73\x22\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00"#
alex_table :: AlexAddr
alex_table = AlexA#
"\x00\x00\x87\x00\x87\x00\x87\x00\x87\x00\x87\x00\x88\x00\x8a\x00\x91\x00\x94\x00\x7d\x00\x7c\x00\x7b\x00\x07\x00\x0b\x00\x0d\x00\x67\x00\x66\x00\x11\x00\x13\x00\x41\x00\x12\x00\x31\x00\x3f\x00\x87\x00\x9d\x00\xa0\x00\x8f\x00\x8e\x00\x37\x00\x9d\x00\x9d\x00\x8f\x00\x8f\x00\x8f\x00\x8f\x00\x8f\x00\x8d\x00\x8f\x00\x8f\x00\x9d\x00\x9d\x00\x9d\x00\x9d\x00\x9d\x00\x9d\x00\x9d\x00\x9d\x00\x9d\x00\x9d\x00\x9b\x00\x8f\x00\xa9\x00\x9d\x00\x9d\x00\x8f\x00\x9c\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x8f\x00\x9f\x00\x8f\x00\x8f\x00\x9d\x00\x9d\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x9e\x00\x8c\x00\x8f\x00\x8f\x00\x8f\x00\x9d\x00\x40\x00\x3e\x00\x2f\x00\x87\x00\x87\x00\x87\x00\x87\x00\x87\x00\x0a\x00\x0a\x00\x0a\x00\x0a\x00\x0a\x00\x0c\x00\x0c\x00\x0c\x00\x0c\x00\x0c\x00\x21\x00\x32\x00\x33\x00\x30\x00\x22\x00\x1f\x00\x2e\x00\x35\x00\x87\x00\x20\x00\x34\x00\x50\x00\x10\x00\x0a\x00\x06\x00\x90\x00\x88\x00\x93\x00\x0c\x00\x92\x00\x8a\x00\x03\x00\x2a\x00\x00\x00\x00\x00\x36\x00\x95\x00\x95\x00\x95\x00\x95\x00\x95\x00\x95\x00\x95\x00\x95\x00\x95\x00\x95\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xa6\x00\x00\x00\x00\x00\x00\x00\x00\x00\xa8\x00\x75\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x76\x00\x15\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x63\x00\x24\x00\x4c\x00\x4c\x00\x4c\x00\x4d\x00\x72\x00\x87\x00\x87\x00\x87\x00\x87\x00\x87\x00\x00\x00\xff\xff\x87\x00\x87\x00\x87\x00\x87\x00\x87\x0
0\x97\x00\x97\x00\x97\x00\x97\x00\x97\x00\x97\x00\x97\x00\x97\x00\x00\x00\x00\x00\x00\x00\x87\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x87\x00\x00\x00\x00\x00\x00\x00\x00\x00\xac\x00\x03\x00\x00\x00\x00\x00\xaa\x00\x00\x00\x8b\x00\x97\x00\x97\x00\x97\x00\x97\x00\x97\x00\x97\x00\x97\x00\x97\x00\x00\x00\x00\x00\x00\x00\xad\x00\x05\x00\x00\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x51\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x52\x00\x29\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x43\x00\x2d\x00\x38\x00\x38\x00\x38\x00\x39\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x08\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x08\x0
0\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x00\x00\xff\xff\x00\x00\x00\x00\x08\x00\x00\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x09\x00\x00\x00\xa2\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x00\x00\xa5\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x00\x00\x00\x00\x00\x00\x00\x00\x09\x00\x00\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x00\x00\x00\x00\xa4\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x5f\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x0
0\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x60\x00\x25\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4b\x00\x2b\x00\x3c\x00\x3c\x00\x3c\x00\x3d\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xa7\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x08\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x5c\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x0
0\x5e\x00\x5e\x00\x5d\x00\x26\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x49\x00\x2c\x00\x3a\x00\x3a\x00\x3a\x00\x3b\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x82\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x7f\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x0
0\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x78\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x75\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x0
0\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x77\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x09\x00\x71\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x0e\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x0
0\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x0f\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6a\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x0
0\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x14\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x15\x00\x62\x00\x62\x00\x62\x00\x62\x0
0\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x62\x00\x5f\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x0
0\x61\x00\x5c\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x1a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x0
0\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x1d\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x51\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x0
0\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x25\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x26\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x0
0\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x29\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x43\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf
f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x49\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x4b\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x52\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf
f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x55\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x5b\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf
f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x5d\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x60\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x63\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf
f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x65\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf
f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x6b\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x6e\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf
f\xff\xff\xff\xff\xff\xff\x70\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x73\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf
f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf
f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x76\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x79\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x69\x00\x69\x00\x69\x00\x69\x00\x69\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x69\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x05\x00\x00\x00\x04\x00\xff\xff\xff\xff\xff\xff\xff\xf
f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x00\x00\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x51\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x52\x00\x29\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x43\x00\x2d\x00\x38\x00\x38\x00\x38\x00\x39\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x69\x00\x69\x00\x69\x00\x69\x00\x69\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x69\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x68\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x1e\x00\x00\x00\x00\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x04\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x68\x00\x68\x00\x68\x0
0\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x00\x00\x00\x00\x00\x00\x00\x00\x68\x00\x00\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x80\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x84\x00\xff\xf
f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf
f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x71\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x74\x00\x73\x00\x1a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5a\x00\x5b\x00\x27\x00\x46\x00\x46\x00\x46\x00\x47\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf
f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf
f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x89\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x6a\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6b\x00\x1d\x00\x54\x00\x54\x0
0\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x55\x00\x28\x00\x44\x00\x44\x00\x44\x00\x45\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf
f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x82\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x85\x00\x84\x00\x0e\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x6f\x00\x70\x00\x1b\x00\x58\x00\x58\x00\x58\x00\x59\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x0
0\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\x16\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x00\x00\x00\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x7f\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x81\x00\x80\x00\x0f\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6d\x00\x6e\x00\x1c\x00\x56\x00\x56\x00\x56\x00\x57\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x0
0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x05\x00\x00\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\x96\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\x17\x00\x00\x00\x00\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x51\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x53\x00\x52\x00\x29\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x42\x00\x43\x00\x2d\x00\x38\x00\x38\x00\x38\x00\x39\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x69\x00\x69\x00\x69\x00\x69\x00\x69\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0
0\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x69\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x68\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x1e\x00\x00\x00\x00\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x04\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x00\x00\x00\x00\x00\x00\x00\x00\x68\x00\x00\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x68\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x95\x00\x95\x00\x95\x00\x95\x00\x95\x00\x95\x00\x95\x00\x95\x00\x95\x00\x95\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x99\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x9a\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\x98\x00\xff\xff\x00\x00\x0a\x00\x0a\x00\x0a\x00\x0a\x00\x0a\x00\x0c\x00\x0c\x00\x0c\x00\x0c\x00\x0c\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x89\x00\x0a\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0c\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x18\x00\x00\x00\x00\x00\x00\x00\x00\x00\x19\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xa6\x0
0\x00\x00\x00\x00\x00\x00\x00\x00\xa8\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x78\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x7a\x00\x79\x00\x14\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x64\x00\x65\x00\x23\x00\x4e\x00\x4e\x00\x4e\x00\x4f\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x6a\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6c\x00\x6b\x00\x1d\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x54\x00\x55\x00\x28\x00\x44\x00\x44\x00\x44\x00\x45\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x0a\x00\x0a\x00\x0a\x00\x0a\x00\x0a\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0a\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xa1\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x18\x00\x00\x00\x00\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\x00\x00\x00\x00\x00\x00\xa6\x00\x00\x00\x00\x00\x00\x00\xa1\x00\xa1\x00\xa1\x0
0\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\x00\x00\x00\x00\x00\x00\x00\x00\xa1\x00\x00\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\xa1\x00\x0c\x00\x0c\x00\x0c\x00\x0c\x00\x0c\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x0c\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xa3\x00\xff\xff\x00\x00\x00\x00\x00\x00\x00\x00\x19\x00\x00\x00\x00\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\x00\x00\x00\x00\x00\x00\xa8\x00\x00\x00\x00\x00\x00\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa5\x00\x00\x00\x00\x00\x00\x00\xa3\x00\x00\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\xa3\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf
f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x5f\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x61\x00\x60\x00\x25\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4a\x00\x4b\x00\x2b\x00\x3c\x00\x3c\x00\x3c\x00\x3d\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xab\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xa7\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\x00\x00\x00\x00\x00\x00\x00\x00\xab\x00\x00\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\xab\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf
f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x5c\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5e\x00\x5d\x00\x26\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x48\x00\x49\x00\x2c\x00\x3a\x00\x3a\x00\x3a\x00\x3b\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff"#
alex_check :: AlexAddr
alex_check = AlexA#
"\xff\xff\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\x2d\x00\x2d\x00\x67\x00\x73\x00\x65\x00\x6f\x00\x65\x00\x73\x00\x2d\x00\x2d\x00\x2d\x00\x69\x00\x69\x00\x61\x00\x61\x00\x6b\x00\x65\x00\x63\x00\x20\x00\x21\x00\x22\x00\x23\x00\x24\x00\x25\x00\x26\x00\x27\x00\x28\x00\x29\x00\x2a\x00\x2b\x00\x2c\x00\x2d\x00\x2e\x00\x2f\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\x3a\x00\x3b\x00\x3c\x00\x3d\x00\x3e\x00\x3f\x00\x40\x00\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\x5b\x00\x5c\x00\x5d\x00\x5e\x00\x5f\x00\x60\x00\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x7b\x00\x7c\x00\x7d\x00\x7e\x00\x7f\x00\x6f\x00\x63\x00\x63\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\x6f\x00\x6e\x00\x70\x00\x61\x00\x6c\x00\x64\x00\x72\x00\x65\x00\x20\x00\x74\x00\x79\x00\x70\x00\x70\x00\x20\x00\x6e\x00\x72\x00\x2d\x00\x6e\x00\x20\x00\x6e\x00\x2d\x00\x2d\x00\x74\x00\xff\xff\xff\xff\x77\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x3d\x00\xff\xff\xff\xff\xff\xff\xff\xff\x3d\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\x7b\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\xff\xff\x0a\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x0
0\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\xff\xff\xff\xff\xff\xff\x20\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x20\x00\xff\xff\xff\xff\xff\xff\xff\xff\x2c\x00\x2d\x00\xff\xff\xff\xff\x30\x00\xff\xff\x2d\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\xff\xff\xff\xff\xff\xff\x3e\x00\x3a\x00\xff\xff\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x27\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x30\x0
0\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\xff\xff\x0a\x00\xff\xff\xff\xff\x5f\x00\xff\xff\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x27\x00\xff\xff\x7d\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\xff\xff\x3d\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\xff\xff\xff\xff\xff\xff\xff\xff\x5f\x00\xff\xff\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\xff\xff\xff\xff\x7d\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x0
0\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x0a\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x3d\x00\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x0
0\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x0
0\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x0
0\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x0
0\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x0
0\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x0
0\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x0
0\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x0
0\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x0
0\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x0
0\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x0
0\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x0
0\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x0
0\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x0
0\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x00\x00\x01\x00\x02\x00\x03\x00\x04\x00\x05\x00\x06\x00\x07\x00\x08\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\x0e\x00\x0f\x00\x10\x00\x11\x00\x12\x00\x13\x00\x14\x00\x15\x00\x16\x00\x17\x00\x18\x00\x19\x00\x1a\x00\x1b\x00\x1c\x00\x1d\x00\x1e\x00\x1f\x00\x20\x00\x21\x00\x22\x00\x23\x00\x24\x00\x25\x00\x26\x00\x27\x00\x28\x00\x29\x00\x2a\x00\x2b\x00\x2c\x00\x2d\x00\x2e\x00\x2f\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\x3a\x00\x3b\x00\x3c\x00\x3d\x00\x3e\x00\x3f\x00\x40\x00\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\x5b\x00\x5c\x00\x5d\x00\x5e\x00\x5f\x00\x60\x00\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x0
0\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x7b\x00\x7c\x00\x7d\x00\x7e\x00\x7f\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x0
0\xfd\x00\xfe\x00\xff\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x00\x00\x01\x00\x02\x00\x03\x00\x04\x00\x05\x00\x06\x00\x07\x00\x08\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\x0e\x00\x0f\x00\x10\x00\x11\x00\x12\x00\x13\x00\x14\x00\x15\x00\x16\x00\x17\x00\x18\x00\x19\x00\x1a\x00\x1b\x00\x1c\x00\x1d\x00\x1e\x00\x1f\x00\x20\x00\x21\x00\x22\x00\x23\x00\x24\x00\x25\x00\x26\x00\x27\x00\x28\x00\x29\x00\x2a\x00\x2b\x00\x2c\x00\x2d\x00\x2e\x00\x2f\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\x3a\x00\x3b\x00\x3c\x00\x3d\x00\x3e\x00\x3f\x00\x40\x00\x41\x00\x42\x00\x43\x00\x44\x0
0\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\x5b\x00\x5c\x00\x5d\x00\x5e\x00\x5f\x00\x60\x00\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x7b\x00\x7c\x00\x7d\x00\x7e\x00\x7f\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x00\x00\x01\x00\x02\x00\x03\x00\x04\x00\x05\x00\x06\x00\x07\x00\x08\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\x0e\x00\x0f\x00\x10\x00\x11\x00\x12\x00\x13\x00\x14\x00\x15\x00\x16\x00\x17\x00\x18\x00\x19\x00\x1a\x00\x1b\x00\x1c\x00\x1d\x00\x1e\x00\x1f\x00\x20\x00\x21\x00\x22\x00\x23\x00\x24\x00\x25\x00\x26\x00\x27\x00\x28\x00\x29\x00\x2a\x00\x2b\x00\x2c\x00\x2d\x00\x2e\x00\x2f\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\x3a\x00\x3b\x00\x3c\x00\x3d\x00\x3e\x00\x3f\x00\x40\x00\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\x5b\x00\x5c\x00\x5d\x00\x5e\x00\x5f\x00\x60\x00\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x7b\x00\x7c\x00\x7d\x00\x7e\x0
0\x7f\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x0a\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x20\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x3a\x00\xff\xff\x3a\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x0
0\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\xff\xff\xff\xff\xff\xff\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x20\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x27\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x2d\x00\xff\xff\xff\xff\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\x3a\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x41\x00\x42\x00\x43\x0
0\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\xff\xff\xff\xff\xff\xff\xff\xff\x5f\x00\xff\xff\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x00\x00\x01\x00\x02\x00\x03\x00\x04\x00\x05\x00\x06\x00\x07\x00\x08\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\x0e\x00\x0f\x00\x10\x00\x11\x00\x12\x00\x13\x00\x14\x00\x15\x00\x16\x00\x17\x00\x18\x00\x19\x00\x1a\x00\x1b\x00\x1c\x00\x1d\x00\x1e\x00\x1f\x00\x20\x00\x21\x00\x22\x00\x23\x00\x24\x00\x25\x00\x26\x00\x27\x00\x28\x00\x29\x00\x2a\x00\x2b\x00\x2c\x00\x2d\x00\x2e\x00\x2f\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\x3a\x00\x3b\x00\x3c\x00\x3d\x00\x3e\x00\x3f\x00\x40\x00\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\x5b\x00\x5c\x00\x5d\x00\x5e\x00\x5f\x00\x60\x00\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x7b\x00\x7c\x00\x7d\x00\x7e\x00\x7f\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\xbf\x00\xc0\x0
0\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x00\x00\x01\x00\x02\x00\x03\x00\x04\x00\x05\x00\x06\x00\x07\x00\x08\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\x0e\x00\x0f\x00\x10\x00\x11\x00\x12\x00\x13\x00\x14\x00\x15\x00\x16\x00\x17\x00\x18\x00\x19\x00\x1a\x00\x1b\x00\x1c\x00\x1d\x00\x1e\x00\x1f\x00\x20\x00\x21\x00\x22\x00\x23\x00\x24\x00\x25\x00\x26\x00\x27\x00\x28\x00\x29\x00\x2a\x00\x2b\x00\x2c\x00\x2d\x00\x2e\x00\x2f\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\x3a\x00\x3b\x00\x3c\x00\x3d\x00\x3e\x00\x3f\x00\x40\x00\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\x5b\x00\x5c\x00\x5d\x00\x5e\x00\x5f\x00\x60\x00\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x7b\x00\x7c\x00\x7d\x00\x7e\x00\x7f\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x0
0\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x00\x00\x01\x00\x02\x00\x03\x00\x04\x00\x05\x00\x06\x00\x07\x00\x08\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\x0e\x00\x0f\x00\x10\x00\x11\x00\x12\x00\x13\x00\x14\x00\x15\x00\x16\x00\x17\x00\x18\x00\x19\x00\x1a\x00\x1b\x00\x1c\x00\x1d\x00\x1e\x00\x1f\x00\x20\x00\x21\x00\x22\x00\x23\x00\x24\x00\x25\x00\x26\x00\x27\x00\x28\x00\x29\x00\x2a\x00\x2b\x00\x2c\x00\x2d\x00\x2e\x00\x2f\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\x3a\x00\x3b\x00\x3c\x00\x3d\x00\x3e\x00\x3f\x00\x40\x00\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\x5b\x00\x5c\x00\x5d\x00\x5e\x00\x5f\x00\x60\x00\x61\x00\x62\x00\x63\x00\x64\x0
0\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x7b\x00\x7c\x00\x7d\x00\x7e\x00\x7f\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x00\x00\x01\x00\x02\x00\x03\x00\x04\x00\x05\x00\x06\x00\x07\x00\x08\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\x0e\x00\x0f\x00\x10\x00\x11\x00\x12\x00\x13\x00\x14\x00\x15\x00\x16\x00\x17\x00\x18\x00\x19\x00\x1a\x00\x1b\x00\x1c\x00\x1d\x00\x1e\x00\x1f\x00\x20\x00\x21\x00\x22\x00\x23\x00\x24\x00\x25\x00\x26\x00\x27\x00\x28\x00\x29\x00\x2a\x00\x2b\x00\x2c\x00\x2d\x00\x2e\x00\x2f\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\x3a\x00\x3b\x00\x3c\x00\x3d\x00\x3e\x00\x3f\x00\x40\x00\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\x5b\x00\x5c\x00\x5d\x00\x5e\x00\x5f\x00\x60\x00\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x7b\x00\x7c\x00\x7d\x00\x7e\x00\x7f\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x0
0\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x0a\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x22\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x0
0\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x00\x00\x01\x00\x02\x00\x03\x00\x04\x00\x05\x00\x06\x00\x07\x00\x08\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\x0e\x00\x0f\x00\x10\x00\x11\x00\x12\x00\x13\x00\x14\x00\x15\x00\x16\x00\x17\x00\x18\x00\x19\x00\x1a\x00\x1b\x00\x1c\x00\x1d\x00\x1e\x00\x1f\x00\x20\x00\x21\x00\x22\x00\x23\x00\x24\x00\x25\x00\x26\x00\x27\x00\x28\x00\x29\x00\x2a\x00\x2b\x00\x2c\x00\x2d\x00\x2e\x00\x2f\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\x3a\x00\x3b\x00\x3c\x00\x3d\x00\x3e\x00\x3f\x00\x40\x00\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\x5b\x00\x5c\x00\x5d\x00\x5e\x00\x5f\x00\x60\x00\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x7b\x00\x7c\x00\x7d\x00\x7e\x00\x7f\x00\x00\x00\x01\x00\x02\x00\x03\x00\x04\x00\x05\x00\x06\x00\x07\x00\x08\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\x0e\x00\x0f\x00\x10\x00\x11\x00\x12\x00\x13\x00\x14\x00\x15\x00\x16\x00\x17\x00\x18\x00\x19\x00\x1a\x00\x1b\x00\x1c\x00\x1d\x00\x1e\x00\x1f\x00\x20\x00\x21\x00\x22\x00\x23\x00\x24\x00\x25\x00\x26\x00\x27\x00\x28\x00\x29\x00\x2a\x00\x2b\x00\x2c\x00\x2d\x00\x2e\x00\x2f\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\x3a\x00\x3b\x00\x3c\x00\x3d\x00\x3e\x00\x3f\x00\x40\x00\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\x5b\x00\x5c\x0
0\x5d\x00\x5e\x00\x5f\x00\x60\x00\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x7b\x00\x7c\x00\x7d\x00\x7e\x00\x7f\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x0a\x00\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x0
0\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x7b\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\xff\xff\xff\xff\xff\xff\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x0a\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x0
0\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x3a\x00\xff\xff\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x7b\x00\xff\xff\xff\xff\xff\xff\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xf
f\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x20\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x27\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x2d\x00\xff\xff\xff\xff\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\x3a\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\xff\xff\xff\xff\xff\xff\xff\xff\x5f\x00\xff\xff\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x20\x00\x21\x00\x22\x00\x23\x00\x24\x00\x25\x00\x26\x00\x27\x00\x28\x00\x29\x00\x2a\x00\x2b\x00\x2c\x00\x2d\x00\x2e\x00\x2f\x00\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\x3a\x00\x3b\x00\x3c\x00\x3d\x00\x3e\x00\x3f\x00\x40\x00\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\x5b\x00\x5c\x00\x5d\x00\x5e\x00\x5f\x00\x60\x00\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x7b\x00\x7c\x00\x7d\x00\x7e\x00\x7f\x00\x0a\x00\xff\xff\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x22\x00\x20\x00\xff\xff\xff\xff\xff\xff\xff\xff\x20\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x2d\x00\xff\xff\xff\xff\xff\xff\xff\xff\x2d\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x3d\x0
0\xff\xff\xff\xff\xff\xff\xff\xff\x3d\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xff\xff\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x20\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x27\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x2d\x00\xff\xff\xff\xff\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\xff\xff\xff\xff\xff\xff\x3d\x00\xff\xff\xff\xff\xff\xff\x41\x00\x42\x00\x43\x0
0\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\xff\xff\xff\xff\xff\xff\xff\xff\x5f\x00\xff\xff\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\x09\x00\x0a\x00\x0b\x00\x0c\x00\x0d\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x20\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x27\x00\x0a\x00\xff\xff\xff\xff\xff\xff\xff\xff\x2d\x00\xff\xff\xff\xff\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\xff\xff\xff\xff\xff\xff\x3d\x00\xff\xff\xff\xff\xff\xff\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\x3d\x00\xff\xff\xff\xff\xff\xff\x5f\x00\xff\xff\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x0
0\xae\x00\xaf\x00\xb0\x00\xb1\x00\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00\x0a\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x27\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x30\x00\x31\x00\x32\x00\x33\x00\x34\x00\x35\x00\x36\x00\x37\x00\x38\x00\x39\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x3d\x00\x41\x00\x42\x00\x43\x00\x44\x00\x45\x00\x46\x00\x47\x00\x48\x00\x49\x00\x4a\x00\x4b\x00\x4c\x00\x4d\x00\x4e\x00\x4f\x00\x50\x00\x51\x00\x52\x00\x53\x00\x54\x00\x55\x00\x56\x00\x57\x00\x58\x00\x59\x00\x5a\x00\xff\xff\xff\xff\xff\xff\xff\xff\x5f\x00\xff\xff\x61\x00\x62\x00\x63\x00\x64\x00\x65\x00\x66\x00\x67\x00\x68\x00\x69\x00\x6a\x00\x6b\x00\x6c\x00\x6d\x00\x6e\x00\x6f\x00\x70\x00\x71\x00\x72\x00\x73\x00\x74\x00\x75\x00\x76\x00\x77\x00\x78\x00\x79\x00\x7a\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x80\x00\x81\x00\x82\x00\x83\x00\x84\x00\x85\x00\x86\x00\x87\x00\x88\x00\x89\x00\x8a\x00\x8b\x00\x8c\x00\x8d\x00\x8e\x00\x8f\x00\x90\x00\x91\x00\x92\x00\x93\x00\x94\x00\x95\x00\x96\x00\x97\x00\x98\x00\x99\x00\x9a\x00\x9b\x00\x9c\x00\x9d\x00\x9e\x00\x9f\x00\xa0\x00\xa1\x00\xa2\x00\xa3\x00\xa4\x00\xa5\x00\xa6\x00\xa7\x00\xa8\x00\xa9\x00\xaa\x00\xab\x00\xac\x00\xad\x00\xae\x00\xaf\x00\xb0\x00\xb1\x0
0\xb2\x00\xb3\x00\xb4\x00\xb5\x00\xb6\x00\xb7\x00\xb8\x00\xb9\x00\xba\x00\xbb\x00\xbc\x00\xbd\x00\xbe\x00\xbf\x00\xc0\x00\xc1\x00\xc2\x00\xc3\x00\xc4\x00\xc5\x00\xc6\x00\xc7\x00\xc8\x00\xc9\x00\xca\x00\xcb\x00\xcc\x00\xcd\x00\xce\x00\xcf\x00\xd0\x00\xd1\x00\xd2\x00\xd3\x00\xd4\x00\xd5\x00\xd6\x00\xd7\x00\xd8\x00\xd9\x00\xda\x00\xdb\x00\xdc\x00\xdd\x00\xde\x00\xdf\x00\xe0\x00\xe1\x00\xe2\x00\xe3\x00\xe4\x00\xe5\x00\xe6\x00\xe7\x00\xe8\x00\xe9\x00\xea\x00\xeb\x00\xec\x00\xed\x00\xee\x00\xef\x00\xf0\x00\xf1\x00\xf2\x00\xf3\x00\xf4\x00\xf5\x00\xf6\x00\xf7\x00\xf8\x00\xf9\x00\xfa\x00\xfb\x00\xfc\x00\xfd\x00\xfe\x00\xff\x00"#
alex_deflt :: AlexAddr
alex_deflt = AlexA#
"\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x67\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x0b\x00\xff\xff\x0d\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x42\x00\x42\x00\x48\x00\x48\x00\x4a\x00\x4a\x00\xff\xff\xff\xff\xff\xff\xff\xff\x53\x00\x53\x00\x54\x00\x54\x00\x5a\x00\x5a\x00\x5e\x00\x5e\x00\x61\x00\x61\x00\x62\x00\x62\x00\x64\x00\x64\x00\xff\xff\x67\x00\x67\x00\x67\x00\x6c\x00\x6c\x00\x6d\x00\x6d\x00\x6f\x00\x6f\x00\x74\x00\x74\x00\x0d\x00\x0d\x00\x0d\x00\x0b\x00\x0b\x00\x0b\x00\x77\x00\x77\x00\x7a\x00\x7a\x00\xff\xff\x67\x00\xff\xff\xff\xff\x7e\x00\x7e\x00\x7e\x00\x81\x00\x81\x00\x85\x00\x85\x00\xae\x00\xae\x00\xae\x00\xae\x00\x9d\x00\x9d\x00\x9d\x00\x98\x00\x98\x00\x98\x00\xff\xff\xff\xff\xff\xff\x7e\x00\x88\x00\x88\x00\x88\x00\x86\x00\x86\x00\x86\x00\x86\x00\xff\xff\xff\xff\x88\x00\xff\xff\xff\xff\x67\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\x7e\x00\xff\xff\xff\xff\xff\xff\xff\xff\x0b\x00\xff\xff\x0d\x00\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff\xff"#
alex_accept = listArray (0 :: Int, 174)
[ AlexAccNone
, AlexAcc 41
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccNone
, AlexAccSkip
, AlexAcc 40
, AlexAcc 39
, AlexAcc 38
, AlexAcc 37
, AlexAcc 36
, AlexAccPred 35 (alexRightContext 131)(AlexAcc 34)
, AlexAcc 33
, AlexAcc 32
, AlexAcc 31
, AlexAcc 30
, AlexAcc 29
, AlexAcc 28
, AlexAcc 27
, AlexAcc 26
, AlexAcc 25
, AlexAcc 24
, AlexAcc 23
, AlexAcc 22
, AlexAcc 21
, AlexAcc 20
, AlexAcc 19
, AlexAcc 18
, AlexAcc 17
, AlexAcc 16
, AlexAcc 15
, AlexAcc 14
, AlexAcc 13
, AlexAcc 12
, AlexAcc 11
, AlexAcc 10
, AlexAcc 9
, AlexAcc 8
, AlexAcc 7
, AlexAcc 6
, AlexAcc 5
, AlexAcc 4
, AlexAcc 3
, AlexAcc 2
, AlexAcc 1
, AlexAcc 0
]
alex_actions = array (0 :: Int, 42)
[ (41,alex_action_25)
, (40,alex_action_0)
, (39,alex_action_0)
, (38,alex_action_1)
, (37,alex_action_2)
, (36,alex_action_2)
, (35,alex_action_3)
, (34,alex_action_4)
, (33,alex_action_4)
, (32,alex_action_4)
, (31,alex_action_4)
, (30,alex_action_5)
, (29,alex_action_6)
, (28,alex_action_7)
, (27,alex_action_8)
, (26,alex_action_9)
, (25,alex_action_10)
, (24,alex_action_11)
, (23,alex_action_12)
, (22,alex_action_13)
, (21,alex_action_13)
, (20,alex_action_13)
, (19,alex_action_14)
, (18,alex_action_14)
, (17,alex_action_14)
, (16,alex_action_14)
, (15,alex_action_14)
, (14,alex_action_14)
, (13,alex_action_15)
, (12,alex_action_15)
, (11,alex_action_16)
, (10,alex_action_16)
, (9,alex_action_17)
, (8,alex_action_17)
, (7,alex_action_18)
, (6,alex_action_18)
, (5,alex_action_19)
, (4,alex_action_20)
, (3,alex_action_21)
, (2,alex_action_22)
, (1,alex_action_23)
, (0,alex_action_24)
]
{-# LINE 77 "src/Scan.x" #-}
-- -----------------------------------------------------------------------------
-- Token type
data Token = T AlexPosn Tkn
deriving Show
tokPosn (T p _) = p
data Tkn
= SpecialT Char
| CodeT String
| ZeroT
| IdT String
| StringT String
| BindT String
| CharT Char
| SMacT String
| RMacT String
| SMacDefT String
| RMacDefT String
| NumT Int
| WrapperT
| EncodingT
| ActionTypeT
| TokenTypeT
| TypeClassT
| EOFT
deriving Show
-- -----------------------------------------------------------------------------
-- Token functions
special, zero, string, bind, escape, decch, hexch, octch, char :: Action
smac, rmac, smacdef, rmacdef, startcode, wrapper, encoding :: Action
actionty, tokenty, typeclass :: Action
special (p,_,str) _ = return $ T p (SpecialT (head str))
zero (p,_,_) _ = return $ T p ZeroT
string (p,_,str) ln = return $ T p (StringT (extract ln str))
bind (p,_,str) _ = return $ T p (BindT (takeWhile isIdChar str))
escape (p,_,str) _ = return $ T p (CharT (esc str))
decch (p,_,str) ln = return $ T p (CharT (do_ech 10 ln (take (ln-1) (tail str))))
hexch (p,_,str) ln = return $ T p (CharT (do_ech 16 ln (take (ln-2) (drop 2 str))))
octch (p,_,str) ln = return $ T p (CharT (do_ech 8 ln (take (ln-2) (drop 2 str))))
char (p,_,str) _ = return $ T p (CharT (head str))
smac (p,_,str) ln = return $ T p (SMacT (mac ln str))
rmac (p,_,str) ln = return $ T p (RMacT (mac ln str))
smacdef (p,_,str) ln = return $ T p (SMacDefT (macdef ln str))
rmacdef (p,_,str) ln = return $ T p (RMacDefT (macdef ln str))
startcode (p,_,str) ln = return $ T p (IdT (take ln str))
wrapper (p,_,_) _ = return $ T p WrapperT
encoding (p,_,_) _ = return $ T p EncodingT
actionty (p,_,_) _ = return $ T p ActionTypeT
tokenty (p,_,_) _ = return $ T p TokenTypeT
typeclass (p,_,_) _ = return $ T p TypeClassT
isIdChar :: Char -> Bool
isIdChar c = isAlphaNum c || c `elem` "_'"
-- Strip the surrounding delimiters (e.g. the quotes of a string literal)
-- from a token of length ln.
extract :: Int -> String -> String
extract ln str = take (ln-2) (tail str)
-- Decode a numeric character escape in the given radix (10, 16 or 8).
do_ech :: Int -> Int -> String -> Char
do_ech radix _ln str = chr (parseInt radix str)
-- Drop the leading '$' or '@' sigil from a macro reference of length ln.
mac :: Int -> String -> String
mac ln str = take (ln-1) $ tail str
-- Extract a macro name from a definition, stopping at whitespace or '='
-- (spaces around '=' are optional, so '=' must terminate the name too).
macdef :: Int -> String -> String
macdef _ln str = takeWhile (\c -> not (isSpace c) && c /= '=') $ tail str
esc :: String -> Char
esc str =
case head $ tail str of
'a' -> '\a'
'b' -> '\b'
'f' -> '\f'
'n' -> '\n'
'r' -> '\r'
't' -> '\t'
'v' -> '\v'
c -> c
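-- For illustration (not part of the generated scanner): esc "\\n" yields
-- '\n', while an unrecognised escape such as esc "\\q" falls through the
-- final case and yields 'q' unchanged.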
parseInt :: Int -> String -> Int
parseInt radix ds = foldl1 (\n d -> n * radix + d) (map digitToInt ds)
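-- For illustration (not part of the generated scanner): parseInt folds the
-- digits left to right, so parseInt 16 "1f" evaluates to 1*16 + 15 = 31 and
-- parseInt 8 "17" to 1*8 + 7 = 15.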
-- In brace-delimited code, we have to be careful to match braces
-- within the code, but ignore braces inside strings and character
-- literals. We do an approximate job (doing it properly requires
-- implementing a large chunk of the Haskell lexical syntax).
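-- For example, in the fragment `{ f '}' x }` the '}' inside the character
-- literal must not be counted as closing the brace-delimited block; the
-- go_char and go_str helpers below consume such literals without touching
-- the brace-nesting counter.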
code :: Action
code (p,_,_inp) _ = do
currentInput <- getInput
go currentInput 1 ""
where
go :: AlexInput -> Int -> String -> P Token
go inp 0 cs = do
setInput inp
return (T p (CodeT (reverse (tail cs))))
go inp n cs = do
case alexGetChar inp of
Nothing -> err inp
Just (c,inp2) ->
case c of
'{' -> go inp2 (n+1) (c:cs)
'}' -> go inp2 (n-1) (c:cs)
'\'' -> go_char inp2 n (c:cs)
'\"' -> go_str inp2 n (c:cs) '\"'
c2 -> go inp2 n (c2:cs)
go_char :: AlexInput -> Int -> String -> P Token
-- try to catch multiple occurrences of ' at identifier end
go_char inp n cs@('\'':'\'':_) = go inp n cs
-- try to catch occurrences of ' within an identifier
go_char inp n cs@('\'':c2:_)
| isAlphaNum c2 = go inp n cs
go_char inp n cs = go_str inp n cs '\''
go_str :: AlexInput -> Int -> String -> Char -> P Token
go_str inp n cs end = do
case alexGetChar inp of
Nothing -> err inp
Just (c,inp2)
| c == end -> go inp2 n (c:cs)
| otherwise ->
case c of
'\\' -> case alexGetChar inp2 of
Nothing -> err inp2
Just (d,inp3) -> go_str inp3 n (d:c:cs) end
c2 -> go_str inp2 n (c2:cs) end
err inp = do setInput inp; lexError "lexical error in code fragment"
lexError :: String -> P a
lexError s = do
(_,_,_,input) <- getInput
failP (s ++ (if (not (null input))
then " at " ++ show (head input)
else " at end of file"))
lexer :: (Token -> P a) -> P a
lexer cont = lexToken >>= cont
lexToken :: P Token
lexToken = do
inp@(p,c,_,s) <- getInput
sc <- getStartCode
case alexScan inp sc of
AlexEOF -> return (T p EOFT)
AlexError _ -> lexError "lexical error"
AlexSkip inp1 _ -> do
setInput inp1
lexToken
AlexToken inp1 len t -> do
setInput inp1
t (p,c,s) len
type Action = (AlexPosn,Char,String) -> Int -> P Token
skip :: Action
skip _ _ = lexToken
andBegin :: Action -> StartCode -> Action
andBegin act sc inp len = setStartCode sc >> act inp len
afterstartcodes,startcodes :: Int
afterstartcodes = 1
startcodes = 2
alex_action_0 = skip
alex_action_1 = string
alex_action_2 = bind
alex_action_3 = code
alex_action_4 = special
alex_action_5 = wrapper
alex_action_6 = encoding
alex_action_7 = actionty
alex_action_8 = tokenty
alex_action_9 = typeclass
alex_action_10 = decch
alex_action_11 = hexch
alex_action_12 = octch
alex_action_13 = escape
alex_action_14 = char
alex_action_15 = smac
alex_action_16 = rmac
alex_action_17 = smacdef
alex_action_18 = rmacdef
alex_action_19 = special `andBegin` startcodes
alex_action_20 = zero
alex_action_21 = startcode
alex_action_22 = special
alex_action_23 = special `andBegin` afterstartcodes
alex_action_24 = special `andBegin` 0
alex_action_25 = skip `andBegin` 0
{-# LINE 1 "templates/GenericTemplate.hs" #-}
-- -----------------------------------------------------------------------------
-- ALEX TEMPLATE
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
-- -----------------------------------------------------------------------------
-- INTERNALS and main scanner engine
-- Do not remove this comment. Required to fix CPP parsing when using GCC and a clang-compiled alex.
#if __GLASGOW_HASKELL__ > 706
#define GTE(n,m) (tagToEnum# (n >=# m))
#define EQ(n,m) (tagToEnum# (n ==# m))
#else
#define GTE(n,m) (n >=# m)
#define EQ(n,m) (n ==# m)
#endif
data AlexAddr = AlexA# Addr#
-- Do not remove this comment. Required to fix CPP parsing when using GCC and a clang-compiled alex.
#if __GLASGOW_HASKELL__ < 503
uncheckedShiftL# = shiftL#
#endif
{-# INLINE alexIndexInt16OffAddr #-}
alexIndexInt16OffAddr (AlexA# arr) off =
#ifdef WORDS_BIGENDIAN
narrow16Int# i
where
i = word2Int# ((high `uncheckedShiftL#` 8#) `or#` low)
high = int2Word# (ord# (indexCharOffAddr# arr (off' +# 1#)))
low = int2Word# (ord# (indexCharOffAddr# arr off'))
off' = off *# 2#
#else
indexInt16OffAddr# arr off
#endif
{-# INLINE alexIndexInt32OffAddr #-}
alexIndexInt32OffAddr (AlexA# arr) off =
#ifdef WORDS_BIGENDIAN
narrow32Int# i
where
i = word2Int# ((b3 `uncheckedShiftL#` 24#) `or#`
(b2 `uncheckedShiftL#` 16#) `or#`
(b1 `uncheckedShiftL#` 8#) `or#` b0)
b3 = int2Word# (ord# (indexCharOffAddr# arr (off' +# 3#)))
b2 = int2Word# (ord# (indexCharOffAddr# arr (off' +# 2#)))
b1 = int2Word# (ord# (indexCharOffAddr# arr (off' +# 1#)))
b0 = int2Word# (ord# (indexCharOffAddr# arr off'))
off' = off *# 4#
#else
indexInt32OffAddr# arr off
#endif
#if __GLASGOW_HASKELL__ < 503
quickIndex arr i = arr ! i
#else
-- GHC >= 503, unsafeAt is available from Data.Array.Base.
quickIndex = unsafeAt
#endif
-- -----------------------------------------------------------------------------
-- Main lexing routines
data AlexReturn a
= AlexEOF
| AlexError !AlexInput
| AlexSkip !AlexInput !Int
| AlexToken !AlexInput !Int a
-- alexScan :: AlexInput -> StartCode -> AlexReturn a
alexScan input__ (I# (sc))
= alexScanUser undefined input__ (I# (sc))
alexScanUser user__ input__ (I# (sc))
= case alex_scan_tkn user__ input__ 0# input__ sc AlexNone of
(AlexNone, input__') ->
case alexGetByte input__ of
Nothing ->
AlexEOF
Just _ ->
AlexError input__'
(AlexLastSkip input__'' len, _) ->
AlexSkip input__'' len
(AlexLastAcc k input__''' len, _) ->
AlexToken input__''' len (alex_actions ! k)
-- Push the input through the DFA, remembering the most recent accepting
-- state it encountered.
alex_scan_tkn user__ orig_input len input__ s last_acc =
input__ `seq` -- strict in the input
let
new_acc = (check_accs (alex_accept `quickIndex` (I# (s))))
in
new_acc `seq`
case alexGetByte input__ of
Nothing -> (new_acc, input__)
Just (c, new_input) ->
case fromIntegral c of { (I# (ord_c)) ->
let
base = alexIndexInt32OffAddr alex_base s
offset = (base +# ord_c)
check = alexIndexInt16OffAddr alex_check offset
new_s = if GTE(offset,0#) && EQ(check,ord_c)
then alexIndexInt16OffAddr alex_table offset
else alexIndexInt16OffAddr alex_deflt s
in
case new_s of
-1# -> (new_acc, input__)
-- on an error, we want to keep the input *before* the
-- character that failed, not after.
_ -> alex_scan_tkn user__ orig_input (if c < 0x80 || c >= 0xC0 then (len +# 1#) else len)
-- note that the length is increased ONLY if this is the 1st byte in a char encoding
new_input new_s new_acc
}
where
check_accs (AlexAccNone) = last_acc
check_accs (AlexAcc a ) = AlexLastAcc a input__ (I# (len))
check_accs (AlexAccSkip) = AlexLastSkip input__ (I# (len))
check_accs (AlexAccPred a predx rest)
| predx user__ orig_input (I# (len)) input__
= AlexLastAcc a input__ (I# (len))
| otherwise
= check_accs rest
check_accs (AlexAccSkipPred predx rest)
| predx user__ orig_input (I# (len)) input__
= AlexLastSkip input__ (I# (len))
| otherwise
= check_accs rest
data AlexLastAcc
= AlexNone
| AlexLastAcc !Int !AlexInput !Int
| AlexLastSkip !AlexInput !Int
data AlexAcc user
= AlexAccNone
| AlexAcc Int
| AlexAccSkip
| AlexAccPred Int (AlexAccPred user) (AlexAcc user)
| AlexAccSkipPred (AlexAccPred user) (AlexAcc user)
type AlexAccPred user = user -> AlexInput -> Int -> AlexInput -> Bool
-- -----------------------------------------------------------------------------
-- Predicates on a rule
alexAndPred p1 p2 user__ in1 len in2
= p1 user__ in1 len in2 && p2 user__ in1 len in2
--alexPrevCharIsPred :: Char -> AlexAccPred _
alexPrevCharIs c _ input__ _ _ = c == alexInputPrevChar input__
alexPrevCharMatches f _ input__ _ _ = f (alexInputPrevChar input__)
--alexPrevCharIsOneOfPred :: Array Char Bool -> AlexAccPred _
alexPrevCharIsOneOf arr _ input__ _ _ = arr ! alexInputPrevChar input__
--alexRightContext :: Int -> AlexAccPred _
alexRightContext (I# (sc)) user__ _ _ input__ =
case alex_scan_tkn user__ input__ 0# input__ sc AlexNone of
(AlexNone, _) -> False
_ -> True
-- TODO: there's no need to find the longest
-- match when checking the right context, just
-- the first match will do.
alex-3.2.5/src/Scan.x.boot
-------------------------------------------------------------------------------
-- ALEX SCANNER AND LITERATE PREPROCESSOR
--
-- This script defines the grammar used to generate the Alex scanner and a
-- preprocessing scanner for dealing with literate scripts. The actions for
-- the Alex scanner are given separately in the Alex module.
--
-- See the Alex manual for a discussion of the scanners defined here.
--
-- Chris Dornan, Aug-95, 4-Jun-96, 10-Jul-96, 29-Sep-97
-------------------------------------------------------------------------------
{
module Scan (lexer, AlexPosn(..), Token(..), Tkn(..), tokPosn) where
import Data.Char
import ParseMonad
--import Debug.Trace
}
$digit = 0-9
$hexdig = [0-9 A-F a-f]
$octal = 0-7
$lower = a-z
$upper = A-Z
$alpha = [$upper $lower]
$alphanum = [$alpha $digit]
$idchar = [$alphanum \_ \']
$special = [\.\;\,\$\|\*\+\?\#\~\-\{\}\(\)\[\]\^\/]
$graphic = $printable # $white
$nonspecial = $graphic # [$special \%]
@id = $alpha $idchar*
@smac = \$ @id | \$ \{ @id \}
@rmac = \@ @id | \@ \{ @id \}
@comment = "--".*
@ws = $white+ | @comment
alex :-
@ws { skip } -- white space; ignore
<0> \" [^\"]* \" { string }
<0> (@id @ws?)? \:\- { bind }
<0> \{ / (\n | [^$digit]) { code }
<0> $special { special } -- note: matches {
<0> \% "wrapper" { wrapper }
<0> \% "encoding" { encoding }
<0> \% "action" { actionty }
<0> \% "token" { tokenty }
<0> \% "typeclass" { typeclass }
<0> \\ $digit+ { decch }
<0> \\ x $hexdig+ { hexch }
<0> \\ o $octal+ { octch }
<0> \\ $printable { escape }
<0> $nonspecial # [\<] { char }
<0> @smac { smac }
<0> @rmac { rmac }
<0> @smac @ws? \= { smacdef }
<0> @rmac @ws? \= { rmacdef }
-- identifiers are allowed to be unquoted in startcode lists
<0> \< { special `andBegin` startcodes }
<startcodes> 0 { zero }
<startcodes> @id { startcode }
<startcodes> \, { special }
<startcodes> \> { special `andBegin` afterstartcodes }
-- After a <..> startcode sequence, we can have a {...} grouping of rules,
-- so don't try to interpret the opening { as a code block.
<afterstartcodes> \{ (\n | [^$digit ]) { special `andBegin` 0 }
<afterstartcodes> () { skip `andBegin` 0 } -- note: empty pattern
{
-- -----------------------------------------------------------------------------
-- Token type
data Token = T AlexPosn Tkn
deriving Show
tokPosn (T p _) = p
data Tkn
= SpecialT Char
| CodeT String
| ZeroT
| IdT String
| StringT String
| BindT String
| CharT Char
| SMacT String
| RMacT String
| SMacDefT String
| RMacDefT String
| NumT Int
| WrapperT
| EncodingT
| ActionTypeT
| TokenTypeT
| TypeClassT
| EOFT
deriving Show
-- -----------------------------------------------------------------------------
-- Token functions
special, zero, string, bind, escape, decch, hexch, octch, char :: Action
smac, rmac, smacdef, rmacdef, startcode, wrapper, encoding :: Action
actionty, tokenty, typeclass :: Action
special (p,_,str) _ = return $ T p (SpecialT (head str))
zero (p,_,_) _ = return $ T p ZeroT
string (p,_,str) ln = return $ T p (StringT (extract ln str))
bind (p,_,str) _ = return $ T p (BindT (takeWhile isIdChar str))
escape (p,_,str) _ = return $ T p (CharT (esc str))
decch (p,_,str) ln = return $ T p (CharT (do_ech 10 ln (take (ln-1) (tail str))))
hexch (p,_,str) ln = return $ T p (CharT (do_ech 16 ln (take (ln-2) (drop 2 str))))
octch (p,_,str) ln = return $ T p (CharT (do_ech 8 ln (take (ln-2) (drop 2 str))))
char (p,_,str) _ = return $ T p (CharT (head str))
smac (p,_,str) ln = return $ T p (SMacT (mac ln str))
rmac (p,_,str) ln = return $ T p (RMacT (mac ln str))
smacdef (p,_,str) ln = return $ T p (SMacDefT (macdef ln str))
rmacdef (p,_,str) ln = return $ T p (RMacDefT (macdef ln str))
startcode (p,_,str) ln = return $ T p (IdT (take ln str))
wrapper (p,_,_) _ = return $ T p WrapperT
encoding (p,_,_) _ = return $ T p EncodingT
actionty (p,_,_) _ = return $ T p ActionTypeT
tokenty (p,_,_) _ = return $ T p TokenTypeT
typeclass (p,_,_) _ = return $ T p TypeClassT
isIdChar :: Char -> Bool
isIdChar c = isAlphaNum c || c `elem` "_'"
extract :: Int -> String -> String
extract ln str = take (ln-2) (tail str)
do_ech :: Int -> Int -> String -> Char
do_ech radix _ln str = chr (parseInt radix str)
mac :: Int -> String -> String
mac ln str = take (ln-1) $ tail str
-- TODO : replace not . isSpace with (\c -> not (isSpace c) && c /= '=')
macdef :: Int -> String -> String
macdef _ln str = takeWhile (\c -> not (isSpace c) && c /= '=') $ tail str
esc :: String -> Char
esc str =
case head $ tail str of
'a' -> '\a'
'b' -> '\b'
'f' -> '\f'
'n' -> '\n'
'r' -> '\r'
't' -> '\t'
'v' -> '\v'
c -> c
parseInt :: Int -> String -> Int
parseInt radix ds = foldl1 (\n d -> n * radix + d) (map digitToInt ds)
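The escape helpers above all funnel through `parseInt`, which folds digits left-to-right in the given radix. A standalone sketch (not part of the Alex sources; `parseInt` is repeated verbatim so it runs on its own) showing that the decimal, hex, and octal escape forms reduce to the same character code:

```haskell
import Data.Char (chr, digitToInt)

-- Repeated from Scan.x.boot: fold digits left-to-right in the given radix.
parseInt :: Int -> String -> Int
parseInt radix ds = foldl1 (\n d -> n * radix + d) (map digitToInt ds)

main :: IO ()
main = do
  print (parseInt 10 "65")        -- 65
  print (parseInt 16 "41")        -- 65
  print (parseInt 8  "101")       -- 65
  print (chr (parseInt 16 "41"))  -- 'A'
```

So `\65`, `\x41`, and `\o101` in an Alex script all denote the character 'A'.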
-- In brace-delimited code, we have to be careful to match braces
-- within the code, but ignore braces inside strings and character
-- literals. We do an approximate job (doing it properly requires
-- implementing a large chunk of the Haskell lexical syntax).
code :: Action
code (p,_,_inp) _ = do
currentInput <- getInput
go currentInput 1 ""
where
go :: AlexInput -> Int -> String -> P Token
go inp 0 cs = do
setInput inp
return (T p (CodeT (reverse (tail cs))))
go inp n cs = do
case alexGetChar inp of
Nothing -> err inp
Just (c,inp2) ->
case c of
'{' -> go inp2 (n+1) (c:cs)
'}' -> go inp2 (n-1) (c:cs)
'\'' -> go_char inp2 n (c:cs)
'\"' -> go_str inp2 n (c:cs) '\"'
c2 -> go inp2 n (c2:cs)
go_char :: AlexInput -> Int -> String -> P Token
-- try to catch multiple occurrences of ' at identifier end
go_char inp n cs@('\'':'\'':_) = go inp n cs
-- try to catch occurrences of ' within an identifier
go_char inp n cs@('\'':c2:_)
| isAlphaNum c2 = go inp n cs
go_char inp n cs = go_str inp n cs '\''
go_str :: AlexInput -> Int -> String -> Char -> P Token
go_str inp n cs end = do
case alexGetChar inp of
Nothing -> err inp
Just (c,inp2)
| c == end -> go inp2 n (c:cs)
| otherwise ->
case c of
'\\' -> case alexGetChar inp2 of
Nothing -> err inp2
Just (d,inp3) -> go_str inp3 n (d:c:cs) end
c2 -> go_str inp2 n (c2:cs) end
err inp = do setInput inp; lexError "lexical error in code fragment"
lexError :: String -> P a
lexError s = do
(_,_,_,input) <- getInput
failP (s ++ (if (not (null input))
then " at " ++ show (head input)
else " at end of file"))
lexer :: (Token -> P a) -> P a
lexer cont = lexToken >>= cont
lexToken :: P Token
lexToken = do
inp@(p,c,_,s) <- getInput
sc <- getStartCode
case alexScan inp sc of
AlexEOF -> return (T p EOFT)
AlexError _ -> lexError "lexical error"
AlexSkip inp1 _ -> do
setInput inp1
lexToken
AlexToken inp1 len t -> do
setInput inp1
t (p,c,s) len
type Action = (AlexPosn,Char,String) -> Int -> P Token
skip :: Action
skip _ _ = lexToken
andBegin :: Action -> StartCode -> Action
andBegin act sc inp len = setStartCode sc >> act inp len
}
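The brace-matching loop in `code` above can be illustrated with a simplified standalone sketch. This is a toy under stated assumptions (the name `braceSpan` is mine, and it ignores escape sequences and character literals, both of which the real scanner handles), but it shows the core idea: count brace depth, and skip string literals whole so their braces don't count.

```haskell
-- Toy version of the depth-counting in `code`: input starts just after
-- an opening '{'; return the enclosed chunk, or Nothing if unbalanced.
braceSpan :: String -> Maybe String
braceSpan = go (1 :: Int) []
  where
    go 0 acc _        = Just (reverse (tail acc))  -- drop the final '}'
    go _ _   []       = Nothing                    -- ran out of input
    go n acc ('{':cs) = go (n+1) ('{':acc) cs
    go n acc ('}':cs) = go (n-1) ('}':acc) cs
    go n acc ('"':cs) =                            -- skip a string literal whole
      case break (== '"') cs of
        (s, '"':cs') -> go n ('"' : reverse s ++ '"' : acc) cs'
        _            -> Nothing
    go n acc (c:cs)   = go n (c:acc) cs

main :: IO ()
main = do
  print (braceSpan "T p (CodeT s) } trailing")  -- Just "T p (CodeT s) "
  print (braceSpan "a { b } \"}\" }")           -- Just "a { b } \"}\" "
```

The closing brace inside the string literal in the second example is ignored, just as the real `go_str` helper ignores it.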
alex-3.2.5/src/Set.hs
{-# LANGUAGE CPP #-}
module Set ( Set, member, empty, insert ) where
import Data.Set
#if defined(__GLASGOW_HASKELL__) && __GLASGOW_HASKELL__ < 603
member :: Ord a => a -> Set a -> Bool
member = elementOf
empty :: Set a
empty = emptySet
insert :: Ord a => a -> Set a -> Set a
insert = flip addToSet
#endif
alex-3.2.5/src/Sort.hs
{------------------------------------------------------------------------------
SORTING LISTS
This module provides properly parameterised insertion and merge sort functions,
complete with associated functions for inserting and merging. `isort' is the
standard lazy version and can be used to find the minimum k elements of a list
in linear time. The merge sort is based on Bob Buckley's (18-AUG-95) coding of
Knuth's natural merge sort (see Vol. 2). It seems to be fast in the average
case; it makes use of natural runs in the data, becoming linear on ordered
data; and it completes in worst-case time O(n.log(n)). It is divinely elegant.
`nub'' is an n.log(n) version of `nub' and `group_sort' sorts a list into
strictly ascending order, using a combining function in its arguments to
amalgamate duplicates.
Chris Dornan, 14-Aug-93, 17-Nov-94, 29-Dec-95
------------------------------------------------------------------------------}
module Sort where
-- Hide (<=) so that we don't get name shadowing warnings for it
import Prelude hiding ((<=))
-- `isort' is an insertion sort and is here for historical reasons; msort is
-- better in almost every situation.
isort:: (a->a->Bool) -> [a] -> [a]
isort (<=) = foldr (insrt (<=)) []
insrt:: (a->a->Bool) -> a -> [a] -> [a]
insrt _ e [] = [e]
insrt (<=) e l@(h:t) = if e<=h then e:l else h:insrt (<=) e t
msort :: (a->a->Bool) -> [a] -> [a]
msort _ [] = [] -- (foldb f []) is undefined
msort (<=) xs = foldb (mrg (<=)) (runs (<=) xs)
runs :: (a->a->Bool) -> [a] -> [[a]]
runs (<=) xs0 = foldr op [] xs0
where
op z xss@(xs@(x:_):xss') | z<=x = (z:xs):xss'
| otherwise = [z]:xss
op z xss = [z]:xss
foldb :: (a->a->a) -> [a] -> a
foldb _ [x] = x
foldb f xs0 = foldb f (fold xs0)
where
fold (x1:x2:xs) = f x1 x2 : fold xs
fold xs = xs
mrg:: (a->a->Bool) -> [a] -> [a] -> [a]
mrg _ [] l = l
mrg _ l@(_:_) [] = l
mrg (<=) l1@(h1:t1) l2@(h2:t2) =
if h1<=h2
then h1:mrg (<=) t1 l2
else h2:mrg (<=) l1 t2
nub':: (a->a->Bool) -> [a] -> [a]
nub' (<=) l = group_sort (<=) const l
group_sort:: (a->a->Bool) -> (a->[a]->b) -> [a] -> [b]
group_sort le cmb l = s_m (msort le l)
where
s_m [] = []
s_m (h:t) = cmb h (takeWhile (`le` h) t):s_m (dropWhile (`le` h) t)
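A small usage sketch of the natural merge sort above. The definitions are condensed from the module into a single self-contained function (the nesting and the helper name `pair` are mine) so the snippet compiles on its own:

```haskell
-- Natural merge sort, condensed from Sort.hs: split the input into its
-- ascending runs, then merge the runs pairwise until one list remains.
msort :: (a -> a -> Bool) -> [a] -> [a]
msort _  [] = []
msort le xs = foldb mrg (runs xs)
  where
    -- natural ascending runs of the input
    runs = foldr op []
      where
        op z (ys@(y:_):yss) | z `le` y = (z:ys):yss
        op z yss = [z]:yss
    -- pairwise bottom-up merging of the runs
    foldb _ [x] = x
    foldb f ys  = foldb f (pair ys)
      where
        pair (a:b:rest) = f a b : pair rest
        pair rest       = rest
    mrg [] l2 = l2
    mrg l1 [] = l1
    mrg l1@(h1:t1) l2@(h2:t2)
      | h1 `le` h2 = h1 : mrg t1 l2
      | otherwise  = h2 : mrg l1 t2

main :: IO ()
main = do
  print (msort (<=) [3,1,2,1])  -- [1,1,2,3]
  print (msort (<=) "alex")     -- "aelx"
```

On already-sorted input, `runs` produces a single run and `foldb` returns it immediately, which is the linear-on-ordered-data behaviour the header comment describes.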
alex-3.2.5/src/UTF8.hs
module UTF8 where
import Data.Word
import Data.Bits
import Data.Char
{-
-- Could also be imported:
import Codec.Binary.UTF8.Light as UTF8
encode :: Char -> [Word8]
encode c = head (UTF8.encodeUTF8' [UTF8.c2w c])
-}
-- | Encode a Haskell String to a list of Word8 values, in UTF8 format.
encode :: Char -> [Word8]
encode = map fromIntegral . go . ord
where
go oc
| oc <= 0x7f = [oc]
| oc <= 0x7ff = [ 0xc0 + (oc `shiftR` 6)
, 0x80 + oc .&. 0x3f
]
| oc <= 0xffff = [ 0xe0 + (oc `shiftR` 12)
, 0x80 + ((oc `shiftR` 6) .&. 0x3f)
, 0x80 + oc .&. 0x3f
]
| otherwise = [ 0xf0 + (oc `shiftR` 18)
, 0x80 + ((oc `shiftR` 12) .&. 0x3f)
, 0x80 + ((oc `shiftR` 6) .&. 0x3f)
, 0x80 + oc .&. 0x3f
]
alex-3.2.5/src/Util.hs
-- -----------------------------------------------------------------------------
--
-- Util.hs, part of Alex
--
-- (c) Simon Marlow 2003
--
-- General utilities used in various parts of Alex
--
-- ----------------------------------------------------------------------------}
module Util
( str
, char
, nl
, paren
, brack
, interleave_shows
, space
, cjustify
, ljustify
, rjustify
, spaces
, hline
) where
-- Pretty-printing utilities
str :: String -> String -> String
str = showString
char :: Char -> String -> String
char c = (c :)
nl :: String -> String
nl = char '\n'
paren :: (String -> String) -> String -> String
paren s = char '(' . s . char ')'
brack :: (String -> String) -> String -> String
brack s = char '[' . s . char ']'
interleave_shows :: (String -> String) -> [String -> String] -> String -> String
interleave_shows _ [] = id
interleave_shows s xs = foldr1 (\a b -> a . s . b) xs
space :: String -> String
space = char ' '
cjustify, ljustify, rjustify :: Int -> String -> String
cjustify n s = spaces halfm ++ s ++ spaces (m - halfm)
where
m = n - length s
halfm = m `div` 2
ljustify n s = s ++ spaces (max 0 (n - length s))
rjustify n s = spaces (n - length s) ++ s
spaces :: Int -> String
spaces n = replicate n ' '
hline :: String
hline = replicate 77 '-'
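The combinators above are ordinary `ShowS`-style difference strings: each builder has type `String -> String`, so builders compose with `(.)` and render by applying the composite to `""`. A quick standalone sketch, with the relevant definitions repeated from the module:

```haskell
-- Repeated from Util.hs: difference-string builders.
str :: String -> String -> String
str = showString

char :: Char -> String -> String
char c = (c :)

paren :: (String -> String) -> String -> String
paren s = char '(' . s . char ')'

interleave_shows :: (String -> String) -> [String -> String] -> String -> String
interleave_shows _ [] = id
interleave_shows s xs = foldr1 (\a b -> a . s . b) xs

main :: IO ()
main = putStrLn (paren (interleave_shows (str ", ") (map str ["a","b","c"])) "")
-- prints (a, b, c)
```

The difference-string representation makes the repeated appends in the code generator linear rather than quadratic, which is why Util.hs uses it instead of plain `(++)`.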
alex-3.2.5/src/ghc_hooks.c
#include <stdio.h>
void ErrorHdrHook(chan)
FILE *chan;
{}
alex-3.2.5/templates/GenericTemplate.hs
-- -----------------------------------------------------------------------------
-- ALEX TEMPLATE
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
-- -----------------------------------------------------------------------------
-- INTERNALS and main scanner engine
#ifdef ALEX_GHC
#undef __GLASGOW_HASKELL__
#define ALEX_IF_GHC_GT_500 #if __GLASGOW_HASKELL__ > 500
#define ALEX_IF_GHC_LT_503 #if __GLASGOW_HASKELL__ < 503
#define ALEX_IF_GHC_GT_706 #if __GLASGOW_HASKELL__ > 706
#define ALEX_ELIF_GHC_500 #elif __GLASGOW_HASKELL__ == 500
#define ALEX_IF_BIGENDIAN #ifdef WORDS_BIGENDIAN
#define ALEX_ELSE #else
#define ALEX_ENDIF #endif
#define ALEX_DEFINE #define
#endif
#ifdef ALEX_GHC
#define ILIT(n) n#
#define IBOX(n) (I# (n))
#define FAST_INT Int#
-- Do not remove this comment. Required to fix CPP parsing when using GCC and a clang-compiled alex.
ALEX_IF_GHC_GT_706
ALEX_DEFINE GTE(n,m) (tagToEnum# (n >=# m))
ALEX_DEFINE EQ(n,m) (tagToEnum# (n ==# m))
ALEX_ELSE
ALEX_DEFINE GTE(n,m) (n >=# m)
ALEX_DEFINE EQ(n,m) (n ==# m)
ALEX_ENDIF
#define PLUS(n,m) (n +# m)
#define MINUS(n,m) (n -# m)
#define TIMES(n,m) (n *# m)
#define NEGATE(n) (negateInt# (n))
#define IF_GHC(x) (x)
#else
#define ILIT(n) (n)
#define IBOX(n) (n)
#define FAST_INT Int
#define GTE(n,m) (n >= m)
#define EQ(n,m) (n == m)
#define PLUS(n,m) (n + m)
#define MINUS(n,m) (n - m)
#define TIMES(n,m) (n * m)
#define NEGATE(n) (negate (n))
#define IF_GHC(x)
#endif
#ifdef ALEX_GHC
data AlexAddr = AlexA# Addr#
-- Do not remove this comment. Required to fix CPP parsing when using GCC and a clang-compiled alex.
ALEX_IF_GHC_LT_503
uncheckedShiftL# = shiftL#
ALEX_ENDIF
{-# INLINE alexIndexInt16OffAddr #-}
alexIndexInt16OffAddr (AlexA# arr) off =
ALEX_IF_BIGENDIAN
narrow16Int# i
where
i = word2Int# ((high `uncheckedShiftL#` 8#) `or#` low)
high = int2Word# (ord# (indexCharOffAddr# arr (off' +# 1#)))
low = int2Word# (ord# (indexCharOffAddr# arr off'))
off' = off *# 2#
ALEX_ELSE
indexInt16OffAddr# arr off
ALEX_ENDIF
#else
alexIndexInt16OffAddr arr off = arr ! off
#endif
#ifdef ALEX_GHC
{-# INLINE alexIndexInt32OffAddr #-}
alexIndexInt32OffAddr (AlexA# arr) off =
ALEX_IF_BIGENDIAN
narrow32Int# i
where
i = word2Int# ((b3 `uncheckedShiftL#` 24#) `or#`
(b2 `uncheckedShiftL#` 16#) `or#`
(b1 `uncheckedShiftL#` 8#) `or#` b0)
b3 = int2Word# (ord# (indexCharOffAddr# arr (off' +# 3#)))
b2 = int2Word# (ord# (indexCharOffAddr# arr (off' +# 2#)))
b1 = int2Word# (ord# (indexCharOffAddr# arr (off' +# 1#)))
b0 = int2Word# (ord# (indexCharOffAddr# arr off'))
off' = off *# 4#
ALEX_ELSE
indexInt32OffAddr# arr off
ALEX_ENDIF
#else
alexIndexInt32OffAddr arr off = arr ! off
#endif
#ifdef ALEX_GHC
ALEX_IF_GHC_LT_503
quickIndex arr i = arr ! i
ALEX_ELSE
-- GHC >= 503, unsafeAt is available from Data.Array.Base.
quickIndex = unsafeAt
ALEX_ENDIF
#else
quickIndex arr i = arr ! i
#endif
-- -----------------------------------------------------------------------------
-- Main lexing routines
data AlexReturn a
= AlexEOF
| AlexError !AlexInput
| AlexSkip !AlexInput !Int
| AlexToken !AlexInput !Int a
-- alexScan :: AlexInput -> StartCode -> AlexReturn a
alexScan input__ IBOX(sc)
= alexScanUser undefined input__ IBOX(sc)
alexScanUser user__ input__ IBOX(sc)
= case alex_scan_tkn user__ input__ ILIT(0) input__ sc AlexNone of
(AlexNone, input__') ->
case alexGetByte input__ of
Nothing ->
#ifdef ALEX_DEBUG
trace ("End of input.") $
#endif
AlexEOF
Just _ ->
#ifdef ALEX_DEBUG
trace ("Error.") $
#endif
AlexError input__'
(AlexLastSkip input__'' len, _) ->
#ifdef ALEX_DEBUG
trace ("Skipping.") $
#endif
AlexSkip input__'' len
(AlexLastAcc k input__''' len, _) ->
#ifdef ALEX_DEBUG
trace ("Accept.") $
#endif
AlexToken input__''' len (alex_actions ! k)
-- Push the input through the DFA, remembering the most recent accepting
-- state it encountered.
alex_scan_tkn user__ orig_input len input__ s last_acc =
input__ `seq` -- strict in the input
let
new_acc = (check_accs (alex_accept `quickIndex` IBOX(s)))
in
new_acc `seq`
case alexGetByte input__ of
Nothing -> (new_acc, input__)
Just (c, new_input) ->
#ifdef ALEX_DEBUG
trace ("State: " ++ show IBOX(s) ++ ", char: " ++ show c) $
#endif
case fromIntegral c of { IBOX(ord_c) ->
let
base = alexIndexInt32OffAddr alex_base s
offset = PLUS(base,ord_c)
check = alexIndexInt16OffAddr alex_check offset
new_s = if GTE(offset,ILIT(0)) && EQ(check,ord_c)
then alexIndexInt16OffAddr alex_table offset
else alexIndexInt16OffAddr alex_deflt s
in
case new_s of
ILIT(-1) -> (new_acc, input__)
-- on an error, we want to keep the input *before* the
-- character that failed, not after.
_ -> alex_scan_tkn user__ orig_input (if c < 0x80 || c >= 0xC0 then PLUS(len,ILIT(1)) else len)
-- note that the length is increased ONLY if this is the 1st byte in a char encoding
new_input new_s new_acc
}
where
check_accs (AlexAccNone) = last_acc
check_accs (AlexAcc a ) = AlexLastAcc a input__ IBOX(len)
check_accs (AlexAccSkip) = AlexLastSkip input__ IBOX(len)
#ifndef ALEX_NOPRED
check_accs (AlexAccPred a predx rest)
| predx user__ orig_input IBOX(len) input__
= AlexLastAcc a input__ IBOX(len)
| otherwise
= check_accs rest
check_accs (AlexAccSkipPred predx rest)
| predx user__ orig_input IBOX(len) input__
= AlexLastSkip input__ IBOX(len)
| otherwise
= check_accs rest
#endif
data AlexLastAcc
= AlexNone
| AlexLastAcc !Int !AlexInput !Int
| AlexLastSkip !AlexInput !Int
data AlexAcc user
= AlexAccNone
| AlexAcc Int
| AlexAccSkip
#ifndef ALEX_NOPRED
| AlexAccPred Int (AlexAccPred user) (AlexAcc user)
| AlexAccSkipPred (AlexAccPred user) (AlexAcc user)
type AlexAccPred user = user -> AlexInput -> Int -> AlexInput -> Bool
-- -----------------------------------------------------------------------------
-- Predicates on a rule
alexAndPred p1 p2 user__ in1 len in2
= p1 user__ in1 len in2 && p2 user__ in1 len in2
--alexPrevCharIsPred :: Char -> AlexAccPred _
alexPrevCharIs c _ input__ _ _ = c == alexInputPrevChar input__
alexPrevCharMatches f _ input__ _ _ = f (alexInputPrevChar input__)
--alexPrevCharIsOneOfPred :: Array Char Bool -> AlexAccPred _
alexPrevCharIsOneOf arr _ input__ _ _ = arr ! alexInputPrevChar input__
--alexRightContext :: Int -> AlexAccPred _
alexRightContext IBOX(sc) user__ _ _ input__ =
case alex_scan_tkn user__ input__ ILIT(0) input__ sc AlexNone of
(AlexNone, _) -> False
_ -> True
-- TODO: there's no need to find the longest
-- match when checking the right context, just
-- the first match will do.
#endif
alex-3.2.5/templates/wrappers.hs
-- -----------------------------------------------------------------------------
-- Alex wrapper code.
--
-- This code is in the PUBLIC DOMAIN; you may copy it freely and use
-- it for any purpose whatsoever.
#if defined(ALEX_MONAD) || defined(ALEX_MONAD_BYTESTRING)
import Control.Applicative as App (Applicative (..))
#endif
import Data.Word (Word8)
#if defined(ALEX_BASIC_BYTESTRING) || defined(ALEX_POSN_BYTESTRING) || defined(ALEX_MONAD_BYTESTRING)
import Data.Int (Int64)
import qualified Data.Char
import qualified Data.ByteString.Lazy as ByteString
import qualified Data.ByteString.Internal as ByteString (w2c)
#elif defined(ALEX_STRICT_BYTESTRING)
import qualified Data.Char
import qualified Data.ByteString as ByteString
import qualified Data.ByteString.Internal as ByteString hiding (ByteString)
import qualified Data.ByteString.Unsafe as ByteString
#else
import Data.Char (ord)
import qualified Data.Bits
-- | Encode a Haskell String to a list of Word8 values, in UTF8 format.
utf8Encode :: Char -> [Word8]
utf8Encode = uncurry (:) . utf8Encode'
utf8Encode' :: Char -> (Word8, [Word8])
utf8Encode' c = case go (ord c) of
(x, xs) -> (fromIntegral x, map fromIntegral xs)
where
go oc
| oc <= 0x7f = ( oc
, [
])
| oc <= 0x7ff = ( 0xc0 + (oc `Data.Bits.shiftR` 6)
, [0x80 + oc Data.Bits..&. 0x3f
])
| oc <= 0xffff = ( 0xe0 + (oc `Data.Bits.shiftR` 12)
, [0x80 + ((oc `Data.Bits.shiftR` 6) Data.Bits..&. 0x3f)
, 0x80 + oc Data.Bits..&. 0x3f
])
| otherwise = ( 0xf0 + (oc `Data.Bits.shiftR` 18)
, [0x80 + ((oc `Data.Bits.shiftR` 12) Data.Bits..&. 0x3f)
, 0x80 + ((oc `Data.Bits.shiftR` 6) Data.Bits..&. 0x3f)
, 0x80 + oc Data.Bits..&. 0x3f
])
#endif
type Byte = Word8
-- -----------------------------------------------------------------------------
-- The input type
#if defined(ALEX_POSN) || defined(ALEX_MONAD) || defined(ALEX_GSCAN)
type AlexInput = (AlexPosn, -- current position,
Char, -- previous char
[Byte], -- pending bytes on current char
String) -- current input string
ignorePendingBytes :: AlexInput -> AlexInput
ignorePendingBytes (p,c,_ps,s) = (p,c,[],s)
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar (_p,c,_bs,_s) = c
alexGetByte :: AlexInput -> Maybe (Byte,AlexInput)
alexGetByte (p,c,(b:bs),s) = Just (b,(p,c,bs,s))
alexGetByte (_,_,[],[]) = Nothing
alexGetByte (p,_,[],(c:s)) = let p' = alexMove p c
in case utf8Encode' c of
(b, bs) -> p' `seq` Just (b, (p', c, bs, s))
#endif
#if defined(ALEX_POSN_BYTESTRING) || defined(ALEX_MONAD_BYTESTRING)
type AlexInput = (AlexPosn, -- current position,
Char, -- previous char
ByteString.ByteString, -- current input string
Int64) -- bytes consumed so far
ignorePendingBytes :: AlexInput -> AlexInput
ignorePendingBytes i = i -- no pending bytes when lexing bytestrings
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar (_,c,_,_) = c
alexGetByte :: AlexInput -> Maybe (Byte,AlexInput)
alexGetByte (p,_,cs,n) =
case ByteString.uncons cs of
Nothing -> Nothing
Just (b, cs') ->
let c = ByteString.w2c b
p' = alexMove p c
n' = n+1
in p' `seq` cs' `seq` n' `seq` Just (b, (p', c, cs',n'))
#endif
#ifdef ALEX_BASIC_BYTESTRING
data AlexInput = AlexInput { alexChar :: {-# UNPACK #-} !Char, -- previous char
alexStr :: !ByteString.ByteString, -- current input string
alexBytePos :: {-# UNPACK #-} !Int64} -- bytes consumed so far
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar = alexChar
alexGetByte (AlexInput {alexStr=cs,alexBytePos=n}) =
case ByteString.uncons cs of
Nothing -> Nothing
Just (c, rest) ->
Just (c, AlexInput {
alexChar = ByteString.w2c c,
alexStr = rest,
alexBytePos = n+1})
#endif
#ifdef ALEX_STRICT_BYTESTRING
data AlexInput = AlexInput { alexChar :: {-# UNPACK #-} !Char,
alexStr :: {-# UNPACK #-} !ByteString.ByteString,
alexBytePos :: {-# UNPACK #-} !Int}
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar = alexChar
alexGetByte (AlexInput {alexStr=cs,alexBytePos=n}) =
case ByteString.uncons cs of
Nothing -> Nothing
Just (c, rest) ->
Just (c, AlexInput {
alexChar = ByteString.w2c c,
alexStr = rest,
alexBytePos = n+1})
#endif
-- -----------------------------------------------------------------------------
-- Token positions
-- `Posn' records the location of a token in the input text. It has three
-- fields: the address (number of characters preceding the token), line number
-- and column of a token within the file. `start_pos' gives the position of the
-- start of the file and `eof_pos' a standard encoding for the end of file.
-- `move_pos' calculates the new position after traversing a given character,
-- assuming the usual eight character tab stops.
#if defined(ALEX_POSN) || defined(ALEX_MONAD) || defined(ALEX_POSN_BYTESTRING) || defined(ALEX_MONAD_BYTESTRING) || defined(ALEX_GSCAN)
data AlexPosn = AlexPn !Int !Int !Int
deriving (Eq,Show)
alexStartPos :: AlexPosn
alexStartPos = AlexPn 0 1 1
alexMove :: AlexPosn -> Char -> AlexPosn
alexMove (AlexPn a l c) '\t' = AlexPn (a+1) l (c+alex_tab_size-((c-1) `mod` alex_tab_size))
alexMove (AlexPn a l _) '\n' = AlexPn (a+1) (l+1) 1
alexMove (AlexPn a l c) _ = AlexPn (a+1) l (c+1)
#endif
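The tab case of `alexMove` above rounds the column up to the next tab stop plus one. A standalone sketch of that arithmetic, assuming `alex_tab_size = 8` (in a generated lexer this constant is emitted by Alex; 8 is assumed here as the default):

```haskell
data AlexPosn = AlexPn !Int !Int !Int
  deriving (Eq, Show)

alex_tab_size :: Int
alex_tab_size = 8  -- assumption: the value the generated code uses

-- Repeated from wrappers.hs: advance a position over one character.
alexMove :: AlexPosn -> Char -> AlexPosn
alexMove (AlexPn a l c) '\t' =
  AlexPn (a+1) l (c + alex_tab_size - ((c-1) `mod` alex_tab_size))
alexMove (AlexPn a l _) '\n' = AlexPn (a+1) (l+1) 1
alexMove (AlexPn a l c) _    = AlexPn (a+1) l (c+1)

main :: IO ()
main = mapM_ (\c -> print (alexMove (AlexPn 0 1 c) '\t')) [1, 5, 8, 9]
-- columns 1, 5 and 8 all land on the next stop at column 9;
-- column 9 jumps to column 17
```

The `(c-1) `mod` alex_tab_size` term measures how far past the previous stop the column already is, so the tab always advances to the following stop, never zero characters.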
-- -----------------------------------------------------------------------------
-- Monad (default and with ByteString input)
#if defined(ALEX_MONAD) || defined(ALEX_MONAD_BYTESTRING)
data AlexState = AlexState {
alex_pos :: !AlexPosn, -- position at current input location
#ifndef ALEX_MONAD_BYTESTRING
alex_inp :: String, -- the current input
alex_chr :: !Char, -- the character before the input
alex_bytes :: [Byte],
#else /* ALEX_MONAD_BYTESTRING */
alex_bpos:: !Int64, -- bytes consumed so far
alex_inp :: ByteString.ByteString, -- the current input
alex_chr :: !Char, -- the character before the input
#endif /* ALEX_MONAD_BYTESTRING */
alex_scd :: !Int -- the current startcode
#ifdef ALEX_MONAD_USER_STATE
, alex_ust :: AlexUserState -- AlexUserState will be defined in the user program
#endif
}
-- Compile with -funbox-strict-fields for best results!
#ifndef ALEX_MONAD_BYTESTRING
runAlex :: String -> Alex a -> Either String a
runAlex input__ (Alex f)
= case f (AlexState {alex_bytes = [],
#else /* ALEX_MONAD_BYTESTRING */
runAlex :: ByteString.ByteString -> Alex a -> Either String a
runAlex input__ (Alex f)
= case f (AlexState {alex_bpos = 0,
#endif /* ALEX_MONAD_BYTESTRING */
alex_pos = alexStartPos,
alex_inp = input__,
alex_chr = '\n',
#ifdef ALEX_MONAD_USER_STATE
alex_ust = alexInitUserState,
#endif
alex_scd = 0}) of Left msg -> Left msg
Right ( _, a ) -> Right a
newtype Alex a = Alex { unAlex :: AlexState -> Either String (AlexState, a) }
instance Functor Alex where
fmap f a = Alex $ \s -> case unAlex a s of
Left msg -> Left msg
Right (s', a') -> Right (s', f a')
instance Applicative Alex where
pure a = Alex $ \s -> Right (s, a)
fa <*> a = Alex $ \s -> case unAlex fa s of
Left msg -> Left msg
Right (s', f) -> case unAlex a s' of
Left msg -> Left msg
Right (s'', b) -> Right (s'', f b)
instance Monad Alex where
m >>= k = Alex $ \s -> case unAlex m s of
Left msg -> Left msg
Right (s',a) -> unAlex (k a) s'
return = App.pure
alexGetInput :: Alex AlexInput
alexGetInput
#ifndef ALEX_MONAD_BYTESTRING
= Alex $ \s@AlexState{alex_pos=pos,alex_chr=c,alex_bytes=bs,alex_inp=inp__} ->
Right (s, (pos,c,bs,inp__))
#else /* ALEX_MONAD_BYTESTRING */
= Alex $ \s@AlexState{alex_pos=pos,alex_bpos=bpos,alex_chr=c,alex_inp=inp__} ->
Right (s, (pos,c,inp__,bpos))
#endif /* ALEX_MONAD_BYTESTRING */
alexSetInput :: AlexInput -> Alex ()
#ifndef ALEX_MONAD_BYTESTRING
alexSetInput (pos,c,bs,inp__)
= Alex $ \s -> case s{alex_pos=pos,alex_chr=c,alex_bytes=bs,alex_inp=inp__} of
#else /* ALEX_MONAD_BYTESTRING */
alexSetInput (pos,c,inp__,bpos)
= Alex $ \s -> case s{alex_pos=pos,
alex_bpos=bpos,
alex_chr=c,
alex_inp=inp__} of
#endif /* ALEX_MONAD_BYTESTRING */
state__@(AlexState{}) -> Right (state__, ())
alexError :: String -> Alex a
alexError message = Alex $ const $ Left message
alexGetStartCode :: Alex Int
alexGetStartCode = Alex $ \s@AlexState{alex_scd=sc} -> Right (s, sc)
alexSetStartCode :: Int -> Alex ()
alexSetStartCode sc = Alex $ \s -> Right (s{alex_scd=sc}, ())
#if !defined(ALEX_MONAD_BYTESTRING) && defined(ALEX_MONAD_USER_STATE)
alexGetUserState :: Alex AlexUserState
alexGetUserState = Alex $ \s@AlexState{alex_ust=ust} -> Right (s,ust)
alexSetUserState :: AlexUserState -> Alex ()
alexSetUserState ss = Alex $ \s -> Right (s{alex_ust=ss}, ())
#endif /* !defined(ALEX_MONAD_BYTESTRING) && defined(ALEX_MONAD_USER_STATE) */
alexMonadScan = do
#ifndef ALEX_MONAD_BYTESTRING
inp__ <- alexGetInput
#else /* ALEX_MONAD_BYTESTRING */
inp__@(_,_,_,n) <- alexGetInput
#endif /* ALEX_MONAD_BYTESTRING */
sc <- alexGetStartCode
case alexScan inp__ sc of
AlexEOF -> alexEOF
AlexError ((AlexPn _ line column),_,_,_) -> alexError $ "lexical error at line " ++ (show line) ++ ", column " ++ (show column)
AlexSkip inp__' _len -> do
alexSetInput inp__'
alexMonadScan
#ifndef ALEX_MONAD_BYTESTRING
AlexToken inp__' len action -> do
#else /* ALEX_MONAD_BYTESTRING */
AlexToken inp__'@(_,_,_,n') _ action -> let len = n'-n in do
#endif /* ALEX_MONAD_BYTESTRING */
alexSetInput inp__'
action (ignorePendingBytes inp__) len
-- -----------------------------------------------------------------------------
-- Useful token actions
#ifndef ALEX_MONAD_BYTESTRING
type AlexAction result = AlexInput -> Int -> Alex result
#else /* ALEX_MONAD_BYTESTRING */
type AlexAction result = AlexInput -> Int64 -> Alex result
#endif /* ALEX_MONAD_BYTESTRING */
-- just ignore this token and scan another one
-- skip :: AlexAction result
skip _input _len = alexMonadScan
-- ignore this token, but set the start code to a new value
-- begin :: Int -> AlexAction result
begin code _input _len = do alexSetStartCode code; alexMonadScan
-- perform an action for this token, and set the start code to a new value
andBegin :: AlexAction result -> Int -> AlexAction result
(action `andBegin` code) input__ len = do
alexSetStartCode code
action input__ len
#ifndef ALEX_MONAD_BYTESTRING
token :: (AlexInput -> Int -> token) -> AlexAction token
#else /* ALEX_MONAD_BYTESTRING */
token :: (AlexInput -> Int64 -> token) -> AlexAction token
#endif /* ALEX_MONAD_BYTESTRING */
token t input__ len = return (t input__ len)
#endif /* defined(ALEX_MONAD) || defined(ALEX_MONAD_BYTESTRING) */
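The `begin`/`andBegin` combinators above encode a small discipline: every action runs in a monad carrying the current start code, and the combinators switch it before or after the action. A standalone sketch (not part of the generated template; the tiny `M` monad and its names are illustrative stand-ins for `Alex`) of that discipline:

```haskell
-- Standalone sketch: the start-code discipline behind `begin` and
-- `andBegin`.  A hand-rolled state monad stands in for Alex; the
-- state is just the current start code (an Int).
newtype M a = M { runM :: Int -> (a, Int) }

instance Functor M where
  fmap f (M g) = M $ \s -> let (a, s') = g s in (f a, s')
instance Applicative M where
  pure a = M $ \s -> (a, s)
  M f <*> M g = M $ \s -> let (h, s') = f s
                              (a, s'') = g s'
                          in (h a, s'')
instance Monad M where
  M g >>= k = M $ \s -> let (a, s') = g s in runM (k a) s'

getStartCode :: M Int
getStartCode = M $ \s -> (s, s)

setStartCode :: Int -> M ()
setStartCode sc = M $ \_ -> ((), sc)

-- An action takes the matched input and its length, like AlexAction.
type Action r = String -> Int -> M r

-- `begin code` ignores the token and just switches the start code;
-- `andBegin` runs an action, then switches.
begin :: Int -> Action Int
begin code _input _len = do setStartCode code; getStartCode

andBegin :: Action r -> Int -> Action r
(action `andBegin` code) input len = do
  setStartCode code
  action input len

main :: IO ()
main =
  -- run an action that returns the token length, switching to code 7:
  print (runM (((\_ len -> return len) `andBegin` 7) "tok" 3) 0)
```

Running from start code 0, the action returns the length 3 and leaves the scanner in start code 7, mirroring how `andBegin` is used on rules that enter a new lexical state.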
-- -----------------------------------------------------------------------------
-- Basic wrapper
#ifdef ALEX_BASIC
type AlexInput = (Char,[Byte],String)
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar (c,_,_) = c
-- alexScanTokens :: String -> [token]
alexScanTokens str = go ('\n',[],str)
where go inp__@(_,_bs,s) =
case alexScan inp__ 0 of
AlexEOF -> []
AlexError _ -> error "lexical error"
AlexSkip inp__' _ln -> go inp__'
AlexToken inp__' len act -> act (take len s) : go inp__'
alexGetByte :: AlexInput -> Maybe (Byte,AlexInput)
alexGetByte (c,(b:bs),s) = Just (b,(c,bs,s))
alexGetByte (_,[],[]) = Nothing
alexGetByte (_,[],(c:s)) = case utf8Encode' c of
(b, bs) -> Just (b, (c, bs, s))
#endif
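The basic wrapper's `alexGetByte` feeds the DFA one byte at a time, draining the pending UTF-8 continuation bytes of the current `Char` before encoding the next one. A standalone sketch of that byte-threading (the encoder follows the shape of the `utf8Encode'` shipped elsewhere in this template; `allBytes` is an illustrative driver, not a template function):

```haskell
-- Standalone sketch: how the basic wrapper's (Char,[Byte],String)
-- input threads UTF-8 bytes to the DFA, one byte per step.
import Data.Bits (shiftR, (.&.))
import Data.Char (ord)
import Data.Word (Word8)

type Byte = Word8
type Input = (Char, [Byte], String)  -- prev char, pending bytes, rest

-- Encode one Char as its first UTF-8 byte plus continuation bytes.
utf8Encode' :: Char -> (Byte, [Byte])
utf8Encode' c = case go (ord c) of
  (x, xs) -> (fromIntegral x, map fromIntegral xs)
  where
    go oc
      | oc <= 0x7f   = (oc, [])
      | oc <= 0x7ff  = ( 0xc0 + (oc `shiftR` 6)
                       , [0x80 + oc .&. 0x3f])
      | oc <= 0xffff = ( 0xe0 + (oc `shiftR` 12)
                       , [ 0x80 + ((oc `shiftR` 6) .&. 0x3f)
                         , 0x80 + oc .&. 0x3f ])
      | otherwise    = ( 0xf0 + (oc `shiftR` 18)
                       , [ 0x80 + ((oc `shiftR` 12) .&. 0x3f)
                         , 0x80 + ((oc `shiftR` 6) .&. 0x3f)
                         , 0x80 + oc .&. 0x3f ])

-- Drain pending bytes first, then encode the next Char.
alexGetByte :: Input -> Maybe (Byte, Input)
alexGetByte (c, b:bs, s) = Just (b, (c, bs, s))
alexGetByte (_, [],  []) = Nothing
alexGetByte (_, [], c:s) = case utf8Encode' c of
  (b, bs) -> Just (b, (c, bs, s))

-- Unfold the whole byte stream, as the DFA driver would.
allBytes :: String -> [Byte]
allBytes str = go ('\n', [], str)
  where go inp = case alexGetByte inp of
          Nothing        -> []
          Just (b, inp') -> b : go inp'

main :: IO ()
main = print (allBytes "h\233")  -- 'é' (U+00E9) encodes as 0xc3 0xa9
```

For `"hé"` this yields `[104,195,169]`: one byte for `h`, then the two-byte UTF-8 encoding of `é`, with the second byte served from the pending-bytes list on the following step.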
-- -----------------------------------------------------------------------------
-- Basic wrapper, ByteString version
#ifdef ALEX_BASIC_BYTESTRING
-- alexScanTokens :: ByteString.ByteString -> [token]
alexScanTokens str = go (AlexInput '\n' str 0)
where go inp__ =
case alexScan inp__ 0 of
AlexEOF -> []
AlexError _ -> error "lexical error"
AlexSkip inp__' _len -> go inp__'
AlexToken inp__' _ act ->
let len = alexBytePos inp__' - alexBytePos inp__ in
act (ByteString.take len (alexStr inp__)) : go inp__'
#endif
#ifdef ALEX_STRICT_BYTESTRING
-- alexScanTokens :: ByteString.ByteString -> [token]
alexScanTokens str = go (AlexInput '\n' str 0)
where go inp__ =
case alexScan inp__ 0 of
AlexEOF -> []
AlexError _ -> error "lexical error"
AlexSkip inp__' _len -> go inp__'
AlexToken inp__' _ act ->
let len = alexBytePos inp__' - alexBytePos inp__ in
act (ByteString.take len (alexStr inp__)) : go inp__'
#endif
-- -----------------------------------------------------------------------------
-- Posn wrapper
-- Adds text positions to the basic model.
#ifdef ALEX_POSN
-- alexScanTokens :: String -> [token]
-- alexScanTokens :: String -> [token]
alexScanTokens str0 = go (alexStartPos,'\n',[],str0)
where go inp__@(pos,_,_,str) =
case alexScan inp__ 0 of
AlexEOF -> []
AlexError ((AlexPn _ line column),_,_,_) -> error $ "lexical error at line " ++ (show line) ++ ", column " ++ (show column)
AlexSkip inp__' _ln -> go inp__'
AlexToken inp__' len act -> act pos (take len str) : go inp__'
#endif
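The posn wrapper threads an `AlexPosn` (absolute offset, line, column) through the input, advanced one character at a time. A standalone sketch of that position arithmetic, matching the `alexMove` shipped elsewhere in this template (the `foldl` driver is illustrative):

```haskell
-- Standalone sketch: position tracking in the posn wrapper.
-- AlexPn holds absolute char offset, line (1-based), column (1-based).
data AlexPosn = AlexPn !Int !Int !Int
  deriving (Eq, Show)

alexStartPos :: AlexPosn
alexStartPos = AlexPn 0 1 1

alexMove :: AlexPosn -> Char -> AlexPosn
alexMove (AlexPn a l c) '\t' = AlexPn (a+1) l (((c+7) `div` 8)*8+1)  -- next 8-col tab stop
alexMove (AlexPn a l _) '\n' = AlexPn (a+1) (l+1) 1                  -- newline: next line, col 1
alexMove (AlexPn a l c) _    = AlexPn (a+1) l (c+1)

main :: IO ()
main = print (foldl alexMove alexStartPos "ab\ncd")  -- AlexPn 5 2 3
```

After `"ab\ncd"` the position is offset 5, line 2, column 3; a tab from column 1 jumps straight to column 9, the next 8-column tab stop.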
-- -----------------------------------------------------------------------------
-- Posn wrapper, ByteString version
#ifdef ALEX_POSN_BYTESTRING
-- alexScanTokens :: ByteString.ByteString -> [token]
alexScanTokens str0 = go (alexStartPos,'\n',str0,0)
where go inp__@(pos,_,str,n) =
case alexScan inp__ 0 of
AlexEOF -> []
AlexError ((AlexPn _ line column),_,_,_) -> error $ "lexical error at line " ++ (show line) ++ ", column " ++ (show column)
AlexSkip inp__' _len -> go inp__'
AlexToken inp__'@(_,_,_,n') _ act ->
act pos (ByteString.take (n'-n) str) : go inp__'
#endif
-- -----------------------------------------------------------------------------
-- GScan wrapper
-- For compatibility with previous versions of Alex, and because we can.
#ifdef ALEX_GSCAN
alexGScan stop__ state__ inp__ =
alex_gscan stop__ alexStartPos '\n' [] inp__ (0,state__)
alex_gscan stop__ p c bs inp__ (sc,state__) =
case alexScan (p,c,bs,inp__) sc of
AlexEOF -> stop__ p c inp__ (sc,state__)
AlexError _ -> stop__ p c inp__ (sc,state__)
AlexSkip (p',c',bs',inp__') _len ->
alex_gscan stop__ p' c' bs' inp__' (sc,state__)
AlexToken (p',c',bs',inp__') len k ->
k p c inp__ len (\scs -> alex_gscan stop__ p' c' bs' inp__' scs) (sc,state__)
#endif
alex-3.2.5/test.hs 0000644 0000000 0000000 00000000167 07346545000 012162 0 ustar 00 0000000 0000000 import System.Process (system)
import System.Exit (exitWith)
main = system "make -k -C tests clean all" >>= exitWith
alex-3.2.5/tests/ 0000755 0000000 0000000 00000000000 07346545000 012005 5 ustar 00 0000000 0000000 alex-3.2.5/tests/Makefile 0000755 0000000 0000000 00000004747 07346545000 013464 0 ustar 00 0000000 0000000 # NOTE: `cabal test` will take care to build the local `alex`
# executable and place it into $PATH for us to pick up.
#
# If it doesn't look like the alex binary in $PATH comes from the
# build tree, then we'll fall back to pointing to
# ../dist/build/alex/alex to support running tests via "runghc
# Setup.hs test".
#
ALEX=$(shell which alex)
ifeq "$(filter $(dir $(shell pwd))%,$(ALEX))" ""
ALEX=../dist/build/alex/alex
endif
# NOTE: This assumes that a working `ghc` is on $PATH; this may not necessarily be the same GHC used by `cabal` for building `alex`.
HC=ghc
HC_OPTS=-Wall -fwarn-incomplete-uni-patterns -fno-warn-missing-signatures -fno-warn-unused-imports -fno-warn-tabs -Werror
.PRECIOUS: %.n.hs %.g.hs %.o %.exe %.bin
ifeq "$(TARGETPLATFORM)" "i386-unknown-mingw32"
HS_PROG_EXT = .exe
else
HS_PROG_EXT = .bin
endif
TESTS = \
basic_typeclass.x \
basic_typeclass_bytestring.x \
default_typeclass.x \
gscan_typeclass.x \
monad_typeclass.x \
monad_typeclass_bytestring.x \
monadUserState_typeclass.x \
monadUserState_typeclass_bytestring.x \
null.x \
posn_typeclass.x \
posn_typeclass_bytestring.x \
strict_typeclass.x \
simple.x \
tokens.x \
tokens_bytestring.x \
tokens_bytestring_unicode.x \
tokens_gscan.x \
tokens_monad_bytestring.x \
tokens_monadUserState_bytestring.x \
tokens_posn.x \
tokens_posn_bytestring.x \
tokens_scan_user.x \
tokens_strict_bytestring.x \
unicode.x
# NOTE: `cabal` will set the `alex_datadir` env-var accordingly before invoking the test-suite
#TEST_ALEX_OPTS = --template=../data/
TEST_ALEX_OPTS=
%.n.hs : %.x
$(ALEX) $(TEST_ALEX_OPTS) $< -o $@
%.g.hs : %.x
$(ALEX) $(TEST_ALEX_OPTS) -g $< -o $@
CLEAN_FILES += *.n.hs *.g.hs *.info *.hi *.o *.bin *.exe
ALL_TEST_HS = $(shell echo $(TESTS) | sed -e 's/\([^\. ]*\)\.\(l\)\{0,1\}x/\1.n.hs \1.g.hs/g')
ALL_TESTS = $(patsubst %.hs, %.run, $(ALL_TEST_HS))
%.run : %$(HS_PROG_EXT)
./$<
%$(HS_PROG_EXT) : %.hs
$(HC) $(HC_OPTS) -package array -package bytestring $($*_LD_OPTS) $< -o $@
all :: $(ALL_TESTS)
.PHONY: clean
clean:
rm -f $(CLEAN_FILES)
# NOTE: The `../dist` paths below aren't accurate anymore for recent cabal versions
interact:
ghci -cpp -i../src -i../dist/build/autogen -i../dist/build/alex/alex-tmp Main -fbreak-on-exception
# -args='--template=.. simple.x -o simple.n.hs'
# :set args --template=.. simple.x -o simple.n.hs
alex-3.2.5/tests/basic_typeclass.x 0000755 0000000 0000000 00000003531 07346545000 015353 0 ustar 00 0000000 0000000 {
module Main (main) where
import System.Exit
import Prelude hiding (lex)
}
%wrapper "basic"
%token "Token s"
%typeclass "Read s"
tokens :-
[a-b]+$ { idtoken 0 }
[c-d]+/"." { idtoken 1 }
[e-f]+/{ tokpred } { idtoken 2 }
^[g-h]+$ { idtoken 3 }
^[i-j]+/"." { idtoken 4 }
^[k-l]+/{ tokpred } { idtoken 5 }
[m-n]+$ { idtoken 6 }
[o-p]+/"." { idtoken 7 }
[q-r]+/{ tokpred } { idtoken 8 }
[0-1]^[s-t]+$ { idtoken 9 }
[2-3]^[u-v]+/"." { idtoken 10 }
[4-5]^[w-x]+/{ tokpred } { idtoken 11 }
[y-z]+ { idtoken 12 }
[A-B]+$ ;
[C-D]+/"." ;
[E-F]+/{ tokpred } ;
^[G-H]+$ ;
^[I-J]+/"." ;
^[K-L]+/{ tokpred } ;
[M-N]+$ ;
[O-P]+/"." ;
[Q-R]+/{ tokpred } ;
[0-1]^[S-T]+$ ;
[2-3]^[U-V]+/"." ;
[4-5]^[W-X]+/{ tokpred } ;
[Y-Z]+ ;
\. ;
[ \n\t\r]+ ;
[0-9] ;
{
tokpred :: () -> AlexInput -> Int -> AlexInput -> Bool
tokpred _ _ _ _ = True
idtoken :: Read s => Int -> String -> Token s
idtoken n s = Id n (read ("\"" ++ s ++ "\""))
data Token s = Id Int s
deriving (Show, Ord, Eq)
lex :: Read s => String -> [Token s]
lex = alexScanTokens
input = "abab\ndddc.fff\ngh\nijji.\nllmnm\noop.rq0tsst\n3uuvu.5xxw"
tokens = [ Id 0 "abab", Id 1 "dddc", Id 2 "fff", Id 3 "gh", Id 4 "ijji",
Id 5 "ll", Id 6 "mnm", Id 7 "oop", Id 8 "rq", Id 9 "tsst",
Id 10 "uuvu", Id 11 "xxw"]
main :: IO ()
main =
let
result :: [Token String]
result = lex input
in do
if result /= tokens
then exitFailure
else exitWith ExitSuccess
}
alex-3.2.5/tests/basic_typeclass_bytestring.x 0000755 0000000 0000000 00000003723 07346545000 017630 0 ustar 00 0000000 0000000 {
{-# LANGUAGE OverloadedStrings #-}
module Main (main) where
import System.Exit
import Prelude hiding (lex)
import Data.ByteString.Lazy.Char8 as Lazy
}
%wrapper "basic-bytestring"
%token "Token s"
%typeclass "Read s"
tokens :-
[a-b]+$ { idtoken 0 }
[c-d]+/"." { idtoken 1 }
[e-f]+/{ tokpred } { idtoken 2 }
^[g-h]+$ { idtoken 3 }
^[i-j]+/"." { idtoken 4 }
^[k-l]+/{ tokpred } { idtoken 5 }
[m-n]+$ { idtoken 6 }
[o-p]+/"." { idtoken 7 }
[q-r]+/{ tokpred } { idtoken 8 }
[0-1]^[s-t]+$ { idtoken 9 }
[2-3]^[u-v]+/"." { idtoken 10 }
[4-5]^[w-x]+/{ tokpred } { idtoken 11 }
[y-z]+ { idtoken 12 }
[A-B]+$ ;
[C-D]+/"." ;
[E-F]+/{ tokpred } ;
^[G-H]+$ ;
^[I-J]+/"." ;
^[K-L]+/{ tokpred } ;
[M-N]+$ ;
[O-P]+/"." ;
[Q-R]+/{ tokpred } ;
[0-1]^[S-T]+$ ;
[2-3]^[U-V]+/"." ;
[4-5]^[W-X]+/{ tokpred } ;
[Y-Z]+ ;
\. ;
[ \n\t\r]+ ;
[0-9] ;
{
tokpred :: () -> AlexInput -> Int -> AlexInput -> Bool
tokpred _ _ _ _ = True
idtoken :: Read s => Int -> Lazy.ByteString -> Token s
idtoken n s = Id n (read ("\"" ++ (Lazy.unpack s) ++ "\""))
data Token s = Id Int s
deriving (Show, Ord, Eq)
lex :: Read s => Lazy.ByteString -> [Token s]
lex = alexScanTokens
input = "abab\ndddc.fff\ngh\nijji.\nllmnm\noop.rq0tsst\n3uuvu.5xxw"
tokens = [ Id 0 "abab", Id 1 "dddc", Id 2 "fff", Id 3 "gh", Id 4 "ijji",
Id 5 "ll", Id 6 "mnm", Id 7 "oop", Id 8 "rq", Id 9 "tsst",
Id 10 "uuvu", Id 11 "xxw"]
main :: IO ()
main =
let
result :: [Token String]
result = lex input
in do
if result /= tokens
then exitFailure
else exitWith ExitSuccess
}
alex-3.2.5/tests/default_typeclass.x 0000755 0000000 0000000 00000023071 07346545000 015717 0 ustar 00 0000000 0000000 {
{-# LANGUAGE FlexibleContexts, MultiParamTypeClasses, FunctionalDependencies,
FlexibleInstances #-}
module Main (main) where
import System.Exit
import Prelude hiding (lex)
import qualified Data.Bits
import Control.Applicative
import Control.Monad
import Data.Word
import Data.Char
}
%action "AlexInput -> Int -> m (Token s)"
%typeclass "Read s, MonadState AlexState m"
tokens :-
[a-b]+$ { idtoken 0 }
[c-d]+/"." { idtoken 1 }
[e-f]+/{ tokpred } { idtoken 2 }
^[g-h]+$ { idtoken 3 }
^[i-j]+/"." { idtoken 4 }
^[k-l]+/{ tokpred } { idtoken 5 }
[m-n]+$ { idtoken 6 }
[o-p]+/"." { idtoken 7 }
[q-r]+/{ tokpred } { idtoken 8 }
[0-1]^[s-t]+$ { idtoken 9 }
[2-3]^[u-v]+/"." { idtoken 10 }
[4-5]^[w-x]+/{ tokpred } { idtoken 11 }
[y-z]+ { idtoken 12 }
[A-B]+$ ;
[C-D]+/"." ;
[E-F]+/{ tokpred } ;
^[G-H]+$ ;
^[I-J]+/"." ;
^[K-L]+/{ tokpred } ;
[M-N]+$ ;
[O-P]+/"." ;
[Q-R]+/{ tokpred } ;
[0-1]^[S-T]+$ ;
[2-3]^[U-V]+/"." ;
[4-5]^[W-X]+/{ tokpred } ;
[Y-Z]+ ;
\. ;
[ \n\t\r]+ ;
[0-9] ;
{
-- | Encode a Haskell String to a list of Word8 values, in UTF8 format.
utf8Encode' :: Char -> (Word8, [Word8])
utf8Encode' c = case go (ord c) of
(x, xs) -> (fromIntegral x, map fromIntegral xs)
where
go oc
| oc <= 0x7f = ( oc
, [
])
| oc <= 0x7ff = ( 0xc0 + (oc `Data.Bits.shiftR` 6)
, [0x80 + oc Data.Bits..&. 0x3f
])
| oc <= 0xffff = ( 0xe0 + (oc `Data.Bits.shiftR` 12)
, [0x80 + ((oc `Data.Bits.shiftR` 6) Data.Bits..&. 0x3f)
, 0x80 + oc Data.Bits..&. 0x3f
])
| otherwise = ( 0xf0 + (oc `Data.Bits.shiftR` 18)
, [0x80 + ((oc `Data.Bits.shiftR` 12) Data.Bits..&. 0x3f)
, 0x80 + ((oc `Data.Bits.shiftR` 6) Data.Bits..&. 0x3f)
, 0x80 + oc Data.Bits..&. 0x3f
])
type Byte = Word8
data AlexState = AlexState {
alex_pos :: !AlexPosn, -- position at current input location
alex_inp :: String, -- the current input
alex_chr :: !Char, -- the character before the input
alex_bytes :: [Byte],
alex_scd :: !Int, -- the current startcode
alex_errs :: [String]
}
type AlexInput = (AlexPosn, -- current position,
Char, -- previous char
[Byte], -- pending bytes on current char
String) -- current input string
ignorePendingBytes :: AlexInput -> AlexInput
ignorePendingBytes (p,c,_,s) = (p,c,[],s)
alexInputPrevChar :: AlexInput -> Char
alexInputPrevChar (_,c,_,_) = c
alexGetByte :: AlexInput -> Maybe (Byte,AlexInput)
alexGetByte (p,c,(b:bs),s) = Just (b,(p,c,bs,s))
alexGetByte (_,_,[],[]) = Nothing
alexGetByte (p,_,[],(c:s)) = let p' = alexMove p c
in case utf8Encode' c of
(b, bs) -> p' `seq` Just (b, (p', c, bs, s))
data AlexPosn = AlexPn !Int !Int !Int
deriving (Eq,Show)
alexStartPos :: AlexPosn
alexStartPos = AlexPn 0 1 1
alexMove :: AlexPosn -> Char -> AlexPosn
alexMove (AlexPn a l c) '\t' = AlexPn (a+1) l (((c+7) `div` 8)*8+1)
alexMove (AlexPn a l _) '\n' = AlexPn (a+1) (l+1) 1
alexMove (AlexPn a l c) _ = AlexPn (a+1) l (c+1)
alexGetInput :: MonadState AlexState m => m AlexInput
alexGetInput =
do
AlexState { alex_pos = pos, alex_chr = c,
alex_bytes = bs, alex_inp = inp } <- get
return (pos, c, bs, inp)
alexSetInput :: MonadState AlexState m => AlexInput -> m ()
alexSetInput (pos, c, bs, inp) =
do
s <- get
put s { alex_pos = pos, alex_chr = c,
alex_bytes = bs, alex_inp = inp }
alexError :: (MonadState AlexState m, Read s) => String -> m (Token s)
alexError message =
do
s @ AlexState { alex_errs = errs } <- get
put s { alex_errs = message : errs }
alexMonadScan
alexGetStartCode :: MonadState AlexState m => m Int
alexGetStartCode =
do
AlexState{ alex_scd = sc } <- get
return sc
alexSetStartCode :: MonadState AlexState m => Int -> m ()
alexSetStartCode sc =
do
s <- get
put s { alex_scd = sc }
alexMonadScan :: (MonadState AlexState m, Read s) => m (Token s)
alexMonadScan = do
inp <- alexGetInput
sc <- alexGetStartCode
case alexScan inp sc of
AlexEOF -> alexEOF
AlexError ((AlexPn _ line column),_,_,_) ->
alexError $ "lexical error at line " ++ (show line) ++
", column " ++ (show column)
AlexSkip inp' _ -> do
alexSetInput inp'
alexMonadScan
AlexToken inp' len action -> do
alexSetInput inp'
action (ignorePendingBytes inp) len
alexEOF :: MonadState AlexState m => m (Token s)
alexEOF = return EOF
tokpred :: () -> AlexInput -> Int -> AlexInput -> Bool
tokpred _ _ _ _ = True
idtoken :: (Read s, MonadState AlexState m) =>
Int -> AlexInput -> Int -> m (Token s)
idtoken n (_, _, _, s) len = return (Id n (read ("\"" ++ take len s ++ "\"")))
data Token s = Id Int s | EOF deriving Eq
lex :: (MonadState AlexState m, Read s) => m [Token s]
lex =
do
res <- alexMonadScan
case res of
EOF -> return []
tok ->
do
rest <- lex
return (tok : rest)
input = "abab\ndddc.fff\ngh\nijji.\nllmnm\noop.rq0tsst\n3uuvu.5xxw"
tokens = [ Id 0 "abab", Id 1 "dddc", Id 2 "fff", Id 3 "gh", Id 4 "ijji",
Id 5 "ll", Id 6 "mnm", Id 7 "oop", Id 8 "rq", Id 9 "tsst",
Id 10 "uuvu", Id 11 "xxw"]
main :: IO ()
main =
do
(result, _) <- runStateT lex AlexState { alex_pos = alexStartPos,
alex_inp = input,
alex_chr = '\n',
alex_bytes = [],
alex_scd = 0,
alex_errs= [] }
if result /= tokens
then exitFailure
else exitWith ExitSuccess
-- | Minimal definition is either both of @get@ and @put@ or just @state@
class Monad m => MonadState s m | m -> s where
-- | Return the state from the internals of the monad.
get :: m s
get = state (\s -> (s, s))
-- | Replace the state inside the monad.
put :: s -> m ()
put s = state (\_ -> ((), s))
-- | Embed a simple state action into the monad.
state :: (s -> (a, s)) -> m a
state f = do
s <- get
let ~(a, s') = f s
put s'
return a
-- | Construct a state monad computation from a function.
-- (The inverse of 'runState'.)
state' :: Monad m
=> (s -> (a, s)) -- ^pure state transformer
-> StateT s m a -- ^equivalent state-passing computation
state' f = StateT (return . f)
-- ---------------------------------------------------------------------------
-- | A state transformer monad parameterized by:
--
-- * @s@ - The state.
--
-- * @m@ - The inner monad.
--
-- The 'return' function leaves the state unchanged, while @>>=@ uses
-- the final state of the first computation as the initial state of
-- the second.
newtype StateT s m a = StateT { runStateT :: s -> m (a,s) }
-- | Evaluate a state computation with the given initial state
-- and return the final value, discarding the final state.
--
-- * @'evalStateT' m s = 'liftM' 'fst' ('runStateT' m s)@
evalStateT :: (Monad m) => StateT s m a -> s -> m a
evalStateT m s = do
(a, _) <- runStateT m s
return a
-- | Evaluate a state computation with the given initial state
-- and return the final state, discarding the final value.
--
-- * @'execStateT' m s = 'liftM' 'snd' ('runStateT' m s)@
execStateT :: (Monad m) => StateT s m a -> s -> m s
execStateT m s = do
(_, s') <- runStateT m s
return s'
-- | Map both the return value and final state of a computation using
-- the given function.
--
-- * @'runStateT' ('mapStateT' f m) = f . 'runStateT' m@
mapStateT :: (m (a, s) -> n (b, s)) -> StateT s m a -> StateT s n b
mapStateT f m = StateT $ f . runStateT m
-- | @'withStateT' f m@ executes action @m@ on a state modified by
-- applying @f@.
--
-- * @'withStateT' f m = 'modify' f >> m@
withStateT :: (s -> s) -> StateT s m a -> StateT s m a
withStateT f m = StateT $ runStateT m . f
instance (Functor m) => Functor (StateT s m) where
fmap f m = StateT $ \ s ->
fmap (\ (a, s') -> (f a, s')) $ runStateT m s
instance (Monad m) => Monad (StateT s m) where
return a = state $ \s -> (a, s)
m >>= k = StateT $ \s -> do
(a, s') <- runStateT m s
runStateT (k a) s'
-- | Fetch the current value of the state within the monad.
get' :: (Monad m) => StateT s m s
get' = state $ \s -> (s, s)
-- | @'put' s@ sets the state within the monad to @s@.
put' :: (Monad m) => s -> StateT s m ()
put' s = state $ \_ -> ((), s)
-- | @'modify' f@ is an action that updates the state to the result of
-- applying @f@ to the current state.
--
-- * @'modify' f = 'get' >>= ('put' . f)@
modify' :: (Monad m) => (s -> s) -> StateT s m ()
modify' f = state $ \s -> ((), f s)
instance Monad m => MonadState s (StateT s m) where
get = get'
put = put'
state = state'
instance (Functor m, Monad m) => Applicative (StateT s m) where
pure = return
(<*>) = ap
}
alex-3.2.5/tests/gscan_typeclass.x 0000755 0000000 0000000 00000003755 07346545000 015375 0 ustar 00 0000000 0000000 {
module Main (main) where
import System.Exit
import Prelude hiding (lex)
}
%wrapper "gscan"
%token "[Token s]"
%typeclass "Read s"
tokens :-
[a-b]+$ { idtoken 0 }
[c-d]+/"." { idtoken 1 }
[e-f]+/{ tokpred } { idtoken 2 }
^[g-h]+$ { idtoken 3 }
^[i-j]+/"." { idtoken 4 }
^[k-l]+/{ tokpred } { idtoken 5 }
[m-n]+$ { idtoken 6 }
[o-p]+/"." { idtoken 7 }
[q-r]+/{ tokpred } { idtoken 8 }
[0-1]^[s-t]+$ { idtoken 9 }
[2-3]^[u-v]+/"." { idtoken 10 }
[4-5]^[w-x]+/{ tokpred } { idtoken 11 }
[y-z]+ { idtoken 12 }
[A-B]+$ ;
[C-D]+/"." ;
[E-F]+/{ tokpred } ;
^[G-H]+$ ;
^[I-J]+/"." ;
^[K-L]+/{ tokpred } ;
[M-N]+$ ;
[O-P]+/"." ;
[Q-R]+/{ tokpred } ;
[0-1]^[S-T]+$ ;
[2-3]^[U-V]+/"." ;
[4-5]^[W-X]+/{ tokpred } ;
[Y-Z]+ ;
\. ;
[ \n\t\r]+ ;
[0-9] ;
{
tokpred :: () -> AlexInput -> Int -> AlexInput -> Bool
tokpred _ _ _ _ = True
idtoken :: Read s => Int -> AlexPosn -> Char -> String -> Int ->
((Int,state) -> [Token s]) -> (Int,state) -> [Token s]
idtoken n _ _ s len cont st = Id n (read ("\"" ++ take len s ++ "\"")) : cont st
data Token s = Id Int s deriving Eq
lex :: Read s => String -> [Token s]
lex str = alexGScan (\_ _ _ _ -> []) (0 :: Int) str
input = "abab\ndddc.fff\ngh\nijji.\nllmnm\noop.rq0tsst\n3uuvu.5xxw"
tokens = [ Id 0 "abab", Id 1 "dddc", Id 2 "fff", Id 3 "gh", Id 4 "ijji",
Id 5 "ll", Id 6 "mnm", Id 7 "oop", Id 8 "rq", Id 9 "tsst",
Id 10 "uuvu", Id 11 "xxw"]
main :: IO ()
main =
let
result :: [Token String]
result = lex input
in do
if result /= tokens
then exitFailure
else exitWith ExitSuccess
}
alex-3.2.5/tests/monadUserState_typeclass.x 0000755 0000000 0000000 00000004436 07346545000 017235 0 ustar 00 0000000 0000000 {
module Main (main) where
import System.Exit
import Prelude hiding (lex)
}
%wrapper "monadUserState"
%token "Token s"
%typeclass "Read s"
tokens :-
[a-b]+$ { idtoken 0 }
[c-d]+/"." { idtoken 1 }
[e-f]+/{ tokpred } { idtoken 2 }
^[g-h]+$ { idtoken 3 }
^[i-j]+/"." { idtoken 4 }
^[k-l]+/{ tokpred } { idtoken 5 }
[m-n]+$ { idtoken 6 }
[o-p]+/"." { idtoken 7 }
[q-r]+/{ tokpred } { idtoken 8 }
[0-1]^[s-t]+$ { idtoken 9 }
[2-3]^[u-v]+/"." { idtoken 10 }
[4-5]^[w-x]+/{ tokpred } { idtoken 11 }
[y-z]+ { idtoken 12 }
[A-B]+$ ;
[C-D]+/"." ;
[E-F]+/{ tokpred } ;
^[G-H]+$ ;
^[I-J]+/"." ;
^[K-L]+/{ tokpred } ;
[M-N]+$ ;
[O-P]+/"." ;
[Q-R]+/{ tokpred } ;
[0-1]^[S-T]+$ ;
[2-3]^[U-V]+/"." ;
[4-5]^[W-X]+/{ tokpred } ;
[Y-Z]+ ;
\. ;
[ \n\t\r]+ ;
[0-9] ;
{
type AlexUserState = Int
alexInitUserState = 0
alexEOF :: Alex (Token s)
alexEOF = return EOF
tokpred :: AlexUserState -> AlexInput -> Int -> AlexInput -> Bool
tokpred _ _ _ _ = True
idtoken :: Read s => Int -> AlexInput -> Int -> Alex (Token s)
idtoken n (_, _, _, s) len = return (Id n (read ("\"" ++ take len s ++ "\"")))
data Token s = Id Int s | EOF deriving Eq
lex :: Read s => String -> Either String [Token s]
lex inp =
let
lexAll =
do
res <- alexMonadScan
case res of
EOF -> return []
tok ->
do
rest <- lexAll
return (tok : rest)
in
runAlex inp lexAll
input = "abab\ndddc.fff\ngh\nijji.\nllmnm\noop.rq0tsst\n3uuvu.5xxw"
tokens = [ Id 0 "abab", Id 1 "dddc", Id 2 "fff", Id 3 "gh", Id 4 "ijji",
Id 5 "ll", Id 6 "mnm", Id 7 "oop", Id 8 "rq", Id 9 "tsst",
Id 10 "uuvu", Id 11 "xxw"]
main :: IO ()
main =
let
result = lex input
in do
case result of
Left _ -> exitFailure
Right toks ->
do
if toks /= tokens
then exitFailure
else exitWith ExitSuccess
}
alex-3.2.5/tests/monadUserState_typeclass_bytestring.x 0000755 0000000 0000000 00000004762 07346545000 021511 0 ustar 00 0000000 0000000 {
{-# LANGUAGE OverloadedStrings #-}
module Main (main) where
import System.Exit
import Prelude hiding (lex)
import qualified Data.ByteString.Lazy.Char8 as Lazy
}
%wrapper "monadUserState-bytestring"
%token "Token s"
%typeclass "Read s"
tokens :-
[a-b]+$ { idtoken 0 }
[c-d]+/"." { idtoken 1 }
[e-f]+/{ tokpred } { idtoken 2 }
^[g-h]+$ { idtoken 3 }
^[i-j]+/"." { idtoken 4 }
^[k-l]+/{ tokpred } { idtoken 5 }
[m-n]+$ { idtoken 6 }
[o-p]+/"." { idtoken 7 }
[q-r]+/{ tokpred } { idtoken 8 }
[0-1]^[s-t]+$ { idtoken 9 }
[2-3]^[u-v]+/"." { idtoken 10 }
[4-5]^[w-x]+/{ tokpred } { idtoken 11 }
[y-z]+ { idtoken 12 }
[A-B]+$ ;
[C-D]+/"." ;
[E-F]+/{ tokpred } ;
^[G-H]+$ ;
^[I-J]+/"." ;
^[K-L]+/{ tokpred } ;
[M-N]+$ ;
[O-P]+/"." ;
[Q-R]+/{ tokpred } ;
[0-1]^[S-T]+$ ;
[2-3]^[U-V]+/"." ;
[4-5]^[W-X]+/{ tokpred } ;
[Y-Z]+ ;
\. ;
[ \n\t\r]+ ;
[0-9] ;
{
type AlexUserState = Int
alexInitUserState = 0
alexEOF :: Alex (Token s)
alexEOF = return EOF
tokpred :: AlexUserState -> AlexInput -> Int -> AlexInput -> Bool
tokpred _ _ _ _ = True
idtoken :: Read s => Int -> AlexInput -> Int64 -> Alex (Token s)
idtoken n (_, _, s, _) len =
return (Id n (read ("\"" ++ Lazy.unpack (Lazy.take (fromIntegral len) s) ++
"\"")))
data Token s = Id Int s | EOF deriving Eq
lex :: Read s => Lazy.ByteString -> Either String [Token s]
lex inp =
let
lexAll =
do
res <- alexMonadScan
case res of
EOF -> return []
tok ->
do
rest <- lexAll
return (tok : rest)
in
runAlex inp lexAll
input = "abab\ndddc.fff\ngh\nijji.\nllmnm\noop.rq0tsst\n3uuvu.5xxw"
tokens = [ Id 0 "abab", Id 1 "dddc", Id 2 "fff", Id 3 "gh", Id 4 "ijji",
Id 5 "ll", Id 6 "mnm", Id 7 "oop", Id 8 "rq", Id 9 "tsst",
Id 10 "uuvu", Id 11 "xxw"]
main :: IO ()
main =
let
result :: Either String [Token String]
result = lex input
in do
case result of
Left _ -> exitFailure
Right toks ->
do
if toks /= tokens
then exitFailure
else exitWith ExitSuccess
}
alex-3.2.5/tests/monad_typeclass.x 0000755 0000000 0000000 00000004331 07346545000 015367 0 ustar 00 0000000 0000000 {
module Main (main) where
import System.Exit
import Prelude hiding (lex)
}
%wrapper "monad"
%token "Token s"
%typeclass "Read s"
tokens :-
[a-b]+$ { idtoken 0 }
[c-d]+/"." { idtoken 1 }
[e-f]+/{ tokpred } { idtoken 2 }
^[g-h]+$ { idtoken 3 }
^[i-j]+/"." { idtoken 4 }
^[k-l]+/{ tokpred } { idtoken 5 }
[m-n]+$ { idtoken 6 }
[o-p]+/"." { idtoken 7 }
[q-r]+/{ tokpred } { idtoken 8 }
[0-1]^[s-t]+$ { idtoken 9 }
[2-3]^[u-v]+/"." { idtoken 10 }
[4-5]^[w-x]+/{ tokpred } { idtoken 11 }
[y-z]+ { idtoken 12 }
[A-B]+$ ;
[C-D]+/"." ;
[E-F]+/{ tokpred } ;
^[G-H]+$ ;
^[I-J]+/"." ;
^[K-L]+/{ tokpred } ;
[M-N]+$ ;
[O-P]+/"." ;
[Q-R]+/{ tokpred } ;
[0-1]^[S-T]+$ ;
[2-3]^[U-V]+/"." ;
[4-5]^[W-X]+/{ tokpred } ;
[Y-Z]+ ;
\. ;
[ \n\t\r]+ ;
[0-9] ;
{
alexEOF :: Alex (Token s)
alexEOF = return EOF
tokpred :: () -> AlexInput -> Int -> AlexInput -> Bool
tokpred _ _ _ _ = True
idtoken :: Read s => Int -> AlexInput -> Int -> Alex (Token s)
idtoken n (_, _, _, s) len = return (Id n (read ("\"" ++ take len s ++ "\"")))
data Token s = Id Int s | EOF deriving Eq
lex :: Read s => String -> Either String [Token s]
lex inp =
let
lexAll =
do
res <- alexMonadScan
case res of
EOF -> return []
tok ->
do
rest <- lexAll
return (tok : rest)
in
runAlex inp lexAll
input = "abab\ndddc.fff\ngh\nijji.\nllmnm\noop.rq0tsst\n3uuvu.5xxw"
tokens = [ Id 0 "abab", Id 1 "dddc", Id 2 "fff", Id 3 "gh", Id 4 "ijji",
Id 5 "ll", Id 6 "mnm", Id 7 "oop", Id 8 "rq", Id 9 "tsst",
Id 10 "uuvu", Id 11 "xxw"]
main :: IO ()
main =
let
result = lex input
in do
case result of
Left _ -> exitFailure
Right toks ->
do
if toks /= tokens
then exitFailure
else exitWith ExitSuccess
}
alex-3.2.5/tests/monad_typeclass_bytestring.x 0000755 0000000 0000000 00000004655 07346545000 017652 0 ustar 00 0000000 0000000 {
{-# LANGUAGE OverloadedStrings #-}
module Main (main) where
import System.Exit
import Prelude hiding (lex)
import qualified Data.ByteString.Lazy.Char8 as Lazy
}
%wrapper "monad-bytestring"
%token "Token s"
%typeclass "Read s"
tokens :-
[a-b]+$ { idtoken 0 }
[c-d]+/"." { idtoken 1 }
[e-f]+/{ tokpred } { idtoken 2 }
^[g-h]+$ { idtoken 3 }
^[i-j]+/"." { idtoken 4 }
^[k-l]+/{ tokpred } { idtoken 5 }
[m-n]+$ { idtoken 6 }
[o-p]+/"." { idtoken 7 }
[q-r]+/{ tokpred } { idtoken 8 }
[0-1]^[s-t]+$ { idtoken 9 }
[2-3]^[u-v]+/"." { idtoken 10 }
[4-5]^[w-x]+/{ tokpred } { idtoken 11 }
[y-z]+ { idtoken 12 }
[A-B]+$ ;
[C-D]+/"." ;
[E-F]+/{ tokpred } ;
^[G-H]+$ ;
^[I-J]+/"." ;
^[K-L]+/{ tokpred } ;
[M-N]+$ ;
[O-P]+/"." ;
[Q-R]+/{ tokpred } ;
[0-1]^[S-T]+$ ;
[2-3]^[U-V]+/"." ;
[4-5]^[W-X]+/{ tokpred } ;
[Y-Z]+ ;
\. ;
[ \n\t\r]+ ;
[0-9] ;
{
alexEOF :: Alex (Token s)
alexEOF = return EOF
tokpred :: () -> AlexInput -> Int -> AlexInput -> Bool
tokpred _ _ _ _ = True
idtoken :: Read s => Int -> AlexInput -> Int64 -> Alex (Token s)
idtoken n (_, _, s, _) len =
return (Id n (read ("\"" ++ Lazy.unpack (Lazy.take (fromIntegral len) s) ++
"\"")))
data Token s = Id Int s | EOF deriving Eq
lex :: Read s => Lazy.ByteString -> Either String [Token s]
lex inp =
let
lexAll =
do
res <- alexMonadScan
case res of
EOF -> return []
tok ->
do
rest <- lexAll
return (tok : rest)
in
runAlex inp lexAll
input = "abab\ndddc.fff\ngh\nijji.\nllmnm\noop.rq0tsst\n3uuvu.5xxw"
tokens = [ Id 0 "abab", Id 1 "dddc", Id 2 "fff", Id 3 "gh", Id 4 "ijji",
Id 5 "ll", Id 6 "mnm", Id 7 "oop", Id 8 "rq", Id 9 "tsst",
Id 10 "uuvu", Id 11 "xxw"]
main :: IO ()
main =
let
result :: Either String [Token String]
result = lex input
in do
case result of
Left _ -> exitFailure
Right toks ->
do
if toks /= tokens
then exitFailure
else exitWith ExitSuccess
}
alex-3.2.5/tests/null.x 0000755 0000000 0000000 00000002623 07346545000 013156 0 ustar 00 0000000 0000000 {
-- Tests the basic operation.
module Main where
import Data.Char (toUpper)
import Control.Monad
import System.Exit
import System.IO
import Prelude hiding (null)
}
%wrapper "monad"
@word = [A-Za-z]+
@null = \0
$escchars = [abfnrtv\\"\'&]
@escape = \\ ($escchars | \0)
@gap = \\ $white+ \\
@string = $printable # [\"] | " " | @escape | @gap
@inComment = ([^\*] | $white)+ | ([\*]+ ([\x00-\xff] # [\/]))
tokens :-
$white+ ;
<0> {
@null { null }
@word { word }
\" @string \" { string }
"--" @inComment \n { word }
}
{
{- we can now have comments in source code? -}
word (_,_,_,input) len = return (take len input)
null (_,_,_,_) _ = return "\0"
string (_,_,_,input) _ = return (drop 1 (reverse (drop 1 (reverse input))))
alexEOF = return "stopped."
scanner str = runAlex str $ do
let loop = do tok <- alexMonadScan
if tok == "stopped." || tok == "error."
then return [tok]
else do toks <- loop
return (tok:toks)
loop
main = do
let test1 = scanner str1
when (test1 /= out1) $
do hPutStrLn stderr "Test 1 failed:"
print test1
exitFailure
let test2 = scanner str2
when (test2 /= out2) $
do hPutStrLn stderr "Test 2 failed:"
print test2
exitFailure
str1 = "a\0bb\0ccc\0\0\"\\\0\""
out1 = Right ["a","\NUL","bb","\NUL","ccc","\NUL","\NUL","\\\NUL", "stopped."]
str2 = "."
out2 = Left "lexical error at line 1, column 1"
}
alex-3.2.5/tests/posn_typeclass.x 0000755 0000000 0000000 00000003546 07346545000 015257 0 ustar 00 0000000 0000000 {
module Main (main) where
import System.Exit
import Prelude hiding (lex)
}
%wrapper "posn"
%token "Token s"
%typeclass "Read s"
tokens :-
[a-b]+$ { idtoken 0 }
[c-d]+/"." { idtoken 1 }
[e-f]+/{ tokpred } { idtoken 2 }
^[g-h]+$ { idtoken 3 }
^[i-j]+/"." { idtoken 4 }
^[k-l]+/{ tokpred } { idtoken 5 }
[m-n]+$ { idtoken 6 }
[o-p]+/"." { idtoken 7 }
[q-r]+/{ tokpred } { idtoken 8 }
[0-1]^[s-t]+$ { idtoken 9 }
[2-3]^[u-v]+/"." { idtoken 10 }
[4-5]^[w-x]+/{ tokpred } { idtoken 11 }
[y-z]+ { idtoken 12 }
[A-B]+$ ;
[C-D]+/"." ;
[E-F]+/{ tokpred } ;
^[G-H]+$ ;
^[I-J]+/"." ;
^[K-L]+/{ tokpred } ;
[M-N]+$ ;
[O-P]+/"." ;
[Q-R]+/{ tokpred } ;
[0-1]^[S-T]+$ ;
[2-3]^[U-V]+/"." ;
[4-5]^[W-X]+/{ tokpred } ;
[Y-Z]+ ;
\. ;
[ \n\t\r]+ ;
[0-9] ;
{
tokpred :: () -> AlexInput -> Int -> AlexInput -> Bool
tokpred _ _ _ _ = True
idtoken :: Read s => Int -> AlexPosn -> String -> Token s
idtoken n _ s = Id n (read ("\"" ++ s ++ "\""))
data Token s = Id Int s
deriving (Show, Ord, Eq)
lex :: Read s => String -> [Token s]
lex = alexScanTokens
input = "abab\ndddc.fff\ngh\nijji.\nllmnm\noop.rq0tsst\n3uuvu.5xxw"
tokens = [ Id 0 "abab", Id 1 "dddc", Id 2 "fff", Id 3 "gh", Id 4 "ijji",
Id 5 "ll", Id 6 "mnm", Id 7 "oop", Id 8 "rq", Id 9 "tsst",
Id 10 "uuvu", Id 11 "xxw"]
main :: IO ()
main =
let
result :: [Token String]
result = lex input
in do
if result /= tokens
then exitFailure
else exitWith ExitSuccess
}
alex-3.2.5/tests/posn_typeclass_bytestring.x 0000755 0000000 0000000 00000003740 07346545000 017525 0 ustar 00 0000000 0000000 {
{-# LANGUAGE OverloadedStrings #-}
module Main (main) where
import System.Exit
import Prelude hiding (lex)
import Data.ByteString.Lazy.Char8 as Lazy
}
%wrapper "posn-bytestring"
%token "Token s"
%typeclass "Read s"
tokens :-
[a-b]+$ { idtoken 0 }
[c-d]+/"." { idtoken 1 }
[e-f]+/{ tokpred } { idtoken 2 }
^[g-h]+$ { idtoken 3 }
^[i-j]+/"." { idtoken 4 }
^[k-l]+/{ tokpred } { idtoken 5 }
[m-n]+$ { idtoken 6 }
[o-p]+/"." { idtoken 7 }
[q-r]+/{ tokpred } { idtoken 8 }
[0-1]^[s-t]+$ { idtoken 9 }
[2-3]^[u-v]+/"." { idtoken 10 }
[4-5]^[w-x]+/{ tokpred } { idtoken 11 }
[y-z]+ { idtoken 12 }
[A-B]+$ ;
[C-D]+/"." ;
[E-F]+/{ tokpred } ;
^[G-H]+$ ;
^[I-J]+/"." ;
^[K-L]+/{ tokpred } ;
[M-N]+$ ;
[O-P]+/"." ;
[Q-R]+/{ tokpred } ;
[0-1]^[S-T]+$ ;
[2-3]^[U-V]+/"." ;
[4-5]^[W-X]+/{ tokpred } ;
[Y-Z]+ ;
\. ;
[ \n\t\r]+ ;
[0-9] ;
{
tokpred :: () -> AlexInput -> Int -> AlexInput -> Bool
tokpred _ _ _ _ = True
idtoken :: Read s => Int -> AlexPosn -> Lazy.ByteString -> Token s
idtoken n _ s = Id n (read ("\"" ++ (Lazy.unpack s) ++ "\""))
data Token s = Id Int s
deriving (Show, Ord, Eq)
lex :: Read s => Lazy.ByteString -> [Token s]
lex = alexScanTokens
input = "abab\ndddc.fff\ngh\nijji.\nllmnm\noop.rq0tsst\n3uuvu.5xxw"
tokens = [ Id 0 "abab", Id 1 "dddc", Id 2 "fff", Id 3 "gh", Id 4 "ijji",
Id 5 "ll", Id 6 "mnm", Id 7 "oop", Id 8 "rq", Id 9 "tsst",
Id 10 "uuvu", Id 11 "xxw"]
main :: IO ()
main =
let
result :: [Token String]
result = lex input
in do
if result /= tokens
then exitFailure
else exitWith ExitSuccess
}
alex-3.2.5/tests/simple.x
{
-- Tests the basic operation.
module Main where
import Data.Char (toUpper)
import Control.Monad
import System.Exit
import System.IO
}
%wrapper "monad"
@word = [A-Za-z]+
tokens :-
$white+ ;
<0> {
"magic" { magic } -- should override later patterns
^ @word $ { both } -- test both trailing and left context
@word $ { eol } -- test trailing context
^ @word { bol } -- test left context
@word { word }
}
<0> \( { begin parens }
[A-Za-z]+ { parenword }
\) { begin 0 }
{
{- block comments are allowed in the code section -}
word (_,_,_,input) len = return (take len input)
both (_,_,_,input) len = return ("BOTH:"++ take len input)
eol (_,_,_,input) len = return ("EOL:"++ take len input)
bol (_,_,_,input) len = return ("BOL:"++ take len input)
parenword (_,_,_,input) len = return (map toUpper (take len input))
magic (_,_,_,_) _ = return "PING!"
alexEOF = return "stopped."
scanner str = runAlex str $ do
let loop = do tok <- alexMonadScan
if tok == "stopped." || tok == "error."
then return [tok]
else do toks <- loop
return (tok:toks)
loop
main = do
let test1 = scanner str1
when (test1 /= out1) $
do hPutStrLn stderr "Test 1 failed:"
print test1
exitFailure
let test2 = scanner str2
when (test2 /= out2) $
do hPutStrLn stderr "Test 2 failed:"
print test2
exitFailure
str1 = "a b c (d e f) magic (magic) eol\nbol \nboth\n"
out1 = Right ["BOL:a","b","c","D","E","F","PING!","MAGIC","EOL:eol", "BOL:bol", "BOTH:both", "stopped."]
str2 = "."
out2 = Left "lexical error at line 1, column 1"
}
alex-3.2.5/tests/strict_typeclass.x
{
{-# LANGUAGE OverloadedStrings #-}
module Main (main) where
import System.Exit
import Prelude hiding (lex)
import Data.ByteString.Char8 as Strict
}
%wrapper "strict-bytestring"
%token "Token s"
%typeclass "Read s"
tokens :-
[a-b]+$ { idtoken 0 }
[c-d]+/"." { idtoken 1 }
[e-f]+/{ tokpred } { idtoken 2 }
^[g-h]+$ { idtoken 3 }
^[i-j]+/"." { idtoken 4 }
^[k-l]+/{ tokpred } { idtoken 5 }
[m-n]+$ { idtoken 6 }
[o-p]+/"." { idtoken 7 }
[q-r]+/{ tokpred } { idtoken 8 }
[0-1]^[s-t]+$ { idtoken 9 }
[2-3]^[u-v]+/"." { idtoken 10 }
[4-5]^[w-x]+/{ tokpred } { idtoken 11 }
[y-z]+ { idtoken 12 }
[A-B]+$ ;
[C-D]+/"." ;
[E-F]+/{ tokpred } ;
^[G-H]+$ ;
^[I-J]+/"." ;
^[K-L]+/{ tokpred } ;
[M-N]+$ ;
[O-P]+/"." ;
[Q-R]+/{ tokpred } ;
[0-1]^[S-T]+$ ;
[2-3]^[U-V]+/"." ;
[4-5]^[W-X]+/{ tokpred } ;
[Y-Z]+ ;
\. ;
[ \n\t\r]+ ;
[0-9] ;
{
tokpred :: () -> AlexInput -> Int -> AlexInput -> Bool
tokpred _ _ _ _ = True
idtoken :: Read s => Int -> Strict.ByteString -> Token s
idtoken n s = Id n (read ("\"" ++ (Strict.unpack s) ++ "\""))
data Token s = Id Int s deriving Eq
lex :: Read s => Strict.ByteString -> [Token s]
lex = alexScanTokens
input = "abab\ndddc.fff\ngh\nijji.\nllmnm\noop.rq0tsst\n3uuvu.5xxw"
tokens = [ Id 0 "abab", Id 1 "dddc", Id 2 "fff", Id 3 "gh", Id 4 "ijji",
Id 5 "ll", Id 6 "mnm", Id 7 "oop", Id 8 "rq", Id 9 "tsst",
Id 10 "uuvu", Id 11 "xxw"]
main :: IO ()
main =
let
result :: [Token String]
result = lex input
in do
if result /= tokens
then exitFailure
else exitWith ExitSuccess
}
alex-3.2.5/tests/tokens.x
{
module Main (main) where
import System.Exit
}
%wrapper "basic"
$digit=0-9 -- digits
$alpha = [a-zA-Z] -- alphabetic characters
tokens :-
$white+ ;
"--".* ;
let { \_ -> Let }
in { \_ -> In }
$digit+ { \s -> Int (read s) }
[\=\+\-\*\/\(\)] { \s -> Sym (head s) }
$alpha [$alpha $digit \_ \']* { \s -> Var s }
-- a left-context pattern for testing
^ \# ;
{
-- Each right-hand side has type :: String -> Token
-- The token type:
data Token =
Let |
In |
Sym Char |
Var String |
Int Int |
Err
deriving (Eq,Show)
main = if test1 /= result1 then exitFailure
else exitWith ExitSuccess
test1 = alexScanTokens " let in 012334\n=+*foo bar__'"
result1 = identifierWithLotsOfQuotes''
identifierWithLotsOfQuotes'' :: [Token]
identifierWithLotsOfQuotes'' =
[Let,In,Int 12334,Sym '=',Sym '+',Sym '*',Var "foo",Var "bar__'"]
}
alex-3.2.5/tests/tokens_bytestring.x
{
{-# LANGUAGE OverloadedStrings #-}
module Main (main) where
import System.Exit
import Data.ByteString.Lazy.Char8 (unpack)
}
%wrapper "basic-bytestring"
%encoding "latin1"
$digit = 0-9 -- digits
$alpha = [a-zA-Z] -- alphabetic characters
tokens :-
$white+ ;
"--".* ;
let { \_ -> Let }
in { \_ -> In }
$digit+ { \s -> Int (read (unpack s)) }
[\=\+\-\*\/\(\)] { \s -> Sym (head (unpack s)) }
$alpha [$alpha $digit \_ \']* { \s -> Var (unpack s) }
{
-- Each right-hand side has type :: ByteString -> Token
-- The token type:
data Token =
Let |
In |
Sym Char |
Var String |
Int Int |
Err
deriving (Eq,Show)
main = if test1 /= result1 then exitFailure
else exitWith ExitSuccess
test1 = alexScanTokens " let in 012334\n=+*foo bar__'"
result1 = [Let,In,Int 12334,Sym '=',Sym '+',Sym '*',Var "foo",Var "bar__'"]
}
alex-3.2.5/tests/tokens_bytestring_unicode.x
{
{-# LANGUAGE OverloadedStrings #-}
module Main (main) where
import System.Exit
import Data.ByteString.Lazy.Char8 (unpack)
}
%wrapper "basic-bytestring"
%encoding "utf-8"
$digit = 0-9 -- digits
$alpha = [a-zA-Zαβ] -- alphabetic characters
tokens :-
$white+ ;
"--".* ;
let { \_ -> Let }
in { \_ -> In }
$digit+ { \s -> Int (read (unpack s)) }
[\=\+\-\*\/\(\)] { \s -> Sym (head (unpack s)) }
$alpha [$alpha $digit \_ \']* { \s -> Var (unpack s) }
{
-- Each right-hand side has type :: ByteString -> Token
-- The token type:
data Token =
Let |
In |
Sym Char |
Var String |
Int Int |
Err
deriving (Eq,Show)
main = if test1 /= result1 then exitFailure
else exitWith ExitSuccess
-- \206\177\206\178\206\178 is "αββ" utf-8 encoded
test1 = alexScanTokens " let in 012334\n=+*foo \206\177\206\178\206\178 bar__'"
result1 = [Let,In,Int 12334,Sym '=',Sym '+',Sym '*',Var "foo",Var "\206\177\206\178\206\178",Var "bar__'"]
}
alex-3.2.5/tests/tokens_gscan.x
{
module Main (main) where
import System.Exit
}
%wrapper "gscan"
$digit = 0-9 -- digits
$alpha = [a-zA-Z] -- alphabetic characters
tokens :-
$white+ ;
"--".* ;
let { tok (\p _ -> Let p) }
in { tok (\p _ -> In p) }
$digit+ { tok (\p s -> Int p (read s)) }
[\=\+\-\*\/\(\)] { tok (\p s -> Sym p (head s)) }
$alpha [$alpha $digit \_ \']* { tok (\p s -> Var p s) }
{
-- Some action helpers:
tok f p _ str len cont (sc,state) = f p (take len str) : cont (sc,state)
-- The token type:
data Token =
Let AlexPosn |
In AlexPosn |
Sym AlexPosn Char |
Var AlexPosn String |
Int AlexPosn Int |
Err AlexPosn
deriving (Eq,Show)
main = if test1 /= result1 then exitFailure
else exitWith ExitSuccess
test1 = alexGScan stop undefined " let in 012334\n=+*foo bar__'"
stop _ _ "" (_,_) = []
stop _ _ _ (_,_) = error "lexical error"
result1 = [Let (AlexPn 2 1 3),In (AlexPn 6 1 7),Int (AlexPn 9 1 10) 12334,Sym (AlexPn 16 2 1) '=',Sym (AlexPn 17 2 2) '+',Sym (AlexPn 18 2 3) '*',Var (AlexPn 19 2 4) "foo",Var (AlexPn 23 2 8) "bar__'"]
}
alex-3.2.5/tests/tokens_monadUserState_bytestring.x
{
{-# LANGUAGE OverloadedStrings #-}
module Main (main) where
import System.Exit
import qualified Data.ByteString.Lazy.Char8 as B
}
%wrapper "monadUserState-bytestring"
%encoding "iso-8859-1"
$digit = 0-9 -- digits
$alpha = [a-zA-Z] -- alphabetic characters
tokens :-
$white+ ;
"--".* ;
let { tok (\p _ -> Let p) }
in { tok (\p _ -> In p) }
$digit+ { tok (\p s -> Int p (read (B.unpack s))) }
[\=\+\-\*\/\(\)] { tok (\p s -> Sym p (head (B.unpack s))) }
$alpha [$alpha $digit \_ \']* { tok (\p s -> Var p (B.unpack s)) }
{
-- Each right-hand side has type :: AlexPosn -> String -> Token
-- Some action helpers:
tok f (p,_,input,_) len = return (f p (B.take (fromIntegral len) input))
-- The token type:
data Token =
Let AlexPosn |
In AlexPosn |
Sym AlexPosn Char |
Var AlexPosn String |
Int AlexPosn Int |
Err AlexPosn |
EOF
deriving (Eq,Show)
alexEOF = return EOF
main = if test1 /= result1 then do print test1; exitFailure
else exitWith ExitSuccess
type AlexUserState = ()
alexInitUserState = ()
scanner str = runAlex str $ do
let loop = do tk <- alexMonadScan
if tk == EOF
then return [tk]
else do toks <- loop
return (tk:toks)
loop
test1 = case scanner " let in 012334\n=+*foo bar__'" of
Left err -> error err
Right toks -> toks
result1 = [Let (AlexPn 2 1 3),In (AlexPn 6 1 7),Int (AlexPn 9 1 10) 12334,Sym (AlexPn 16 2 1) '=',Sym (AlexPn 17 2 2) '+',Sym (AlexPn 18 2 3) '*',Var (AlexPn 19 2 4) "foo",Var (AlexPn 23 2 8) "bar__'", EOF]
}
alex-3.2.5/tests/tokens_monad_bytestring.x
{
{-# LANGUAGE OverloadedStrings #-}
module Main (main) where
import System.Exit
import qualified Data.ByteString.Lazy.Char8 as B
}
%wrapper "monad-bytestring"
%encoding "Latin1"
$digit = 0-9 -- digits
$alpha = [a-zA-Z] -- alphabetic characters
tokens :-
$white+ ;
"--".* ;
let { tok (\p _ -> Let p) }
in { tok (\p _ -> In p) }
$digit+ { tok (\p s -> Int p (read (B.unpack s))) }
[\=\+\-\*\/\(\)] { tok (\p s -> Sym p (head (B.unpack s))) }
$alpha [$alpha $digit \_ \']* { tok (\p s -> Var p (B.unpack s)) }
{
-- Each right-hand side has type :: AlexPosn -> String -> Token
-- Some action helpers:
tok f (p,_,input,_) len = return (f p (B.take (fromIntegral len) input))
-- The token type:
data Token =
Let AlexPosn |
In AlexPosn |
Sym AlexPosn Char |
Var AlexPosn String |
Int AlexPosn Int |
Err AlexPosn |
EOF
deriving (Eq,Show)
alexEOF = return EOF
main = if test1 /= result1 then do print test1; exitFailure
else exitWith ExitSuccess
scanner str = runAlex str $ do
let loop = do tk <- alexMonadScan
if tk == EOF
then return [tk]
else do toks <- loop
return (tk:toks)
loop
test1 = case scanner " let in 012334\n=+*foo bar__'" of
Left err -> error err
Right toks -> toks
result1 = [Let (AlexPn 2 1 3),In (AlexPn 6 1 7),Int (AlexPn 9 1 10) 12334,Sym (AlexPn 16 2 1) '=',Sym (AlexPn 17 2 2) '+',Sym (AlexPn 18 2 3) '*',Var (AlexPn 19 2 4) "foo",Var (AlexPn 23 2 8) "bar__'", EOF]
}
alex-3.2.5/tests/tokens_posn.x
{
module Main (main) where
import System.Exit
}
%wrapper "posn"
$digit = 0-9 -- digits
$alpha = [a-zA-Z] -- alphabetic characters
tokens :-
$white+ ;
"--".* ;
let { tok (\p _ -> Let p) }
in { tok (\p _ -> In p) }
$digit+ { tok (\p s -> Int p (read s)) }
[\=\+\-\*\/\(\)] { tok (\p s -> Sym p (head s)) }
$alpha [$alpha $digit \_ \']* { tok (\p s -> Var p s) }
{
-- Each right-hand side has type :: AlexPosn -> String -> Token
-- Some action helpers:
tok f p s = f p s
-- The token type:
data Token =
Let AlexPosn |
In AlexPosn |
Sym AlexPosn Char |
Var AlexPosn String |
Int AlexPosn Int |
Err AlexPosn
deriving (Eq,Show)
main = if test1 /= result1 then exitFailure
else exitWith ExitSuccess
test1 = alexScanTokens " let in 012334\n=+*foo bar__'"
result1 = [Let (AlexPn 2 1 3),In (AlexPn 6 1 7),Int (AlexPn 9 1 10) 12334,Sym (AlexPn 16 2 1) '=',Sym (AlexPn 17 2 2) '+',Sym (AlexPn 18 2 3) '*',Var (AlexPn 19 2 4) "foo",Var (AlexPn 23 2 8) "bar__'"]
}
alex-3.2.5/tests/tokens_posn_bytestring.x
{
{-# LANGUAGE OverloadedStrings #-}
module Main (main) where
import System.Exit
import Data.ByteString.Lazy.Char8 (unpack)
}
%wrapper "posn-bytestring"
%encoding "UTF8"
$digit = 0-9 -- digits
$alpha = [a-zA-Z] -- alphabetic characters
tokens :-
$white+ ;
"--".* ;
let { tok (\p _ -> Let p) }
in { tok (\p _ -> In p) }
$digit+ { tok (\p s -> Int p (read (unpack s))) }
[\=\+\-\*\/\(\)] { tok (\p s -> Sym p (head (unpack s))) }
$alpha [$alpha $digit \_ \']* { tok (\p s -> Var p (unpack s)) }
{
-- Each right-hand side has type :: AlexPosn -> String -> Token
-- Some action helpers:
tok f p s = f p s
-- The token type:
data Token =
Let AlexPosn |
In AlexPosn |
Sym AlexPosn Char |
Var AlexPosn String |
Int AlexPosn Int |
Err AlexPosn
deriving (Eq,Show)
main = if test1 /= result1 then exitFailure
else exitWith ExitSuccess
test1 = alexScanTokens " let in 012334\n=+*foo bar__'"
result1 = [Let (AlexPn 2 1 3),In (AlexPn 6 1 7),Int (AlexPn 9 1 10) 12334,Sym (AlexPn 16 2 1) '=',Sym (AlexPn 17 2 2) '+',Sym (AlexPn 18 2 3) '*',Var (AlexPn 19 2 4) "foo",Var (AlexPn 23 2 8) "bar__'"]
}
alex-3.2.5/tests/tokens_scan_user.x
{
module Main (main) where
import System.Exit
}
%wrapper "basic" -- Defines: AlexInput, alexGetByte, alexPrevChar
$digit = 0-9
$alpha = [a-zA-Z]
$ws = [\ \t\n]
tokens :-
5 / {\ u _ibt _l _iat -> u == FiveIsMagic} { \s -> TFive (head s) }
$digit { \s -> TDigit (head s) }
$alpha { \s -> TAlpha (head s) }
$ws { \s -> TWSpace (head s) }
{
data Token = TDigit Char
| TAlpha Char
| TWSpace Char
| TFive Char -- Predicated only
| TLexError
deriving (Eq,Show)
data UserLexerMode = NormalMode
| FiveIsMagic
deriving Eq
main | test1 /= result1 = exitFailure
| test2 /= result2 = exitFailure
-- all succeeded
| otherwise = exitWith ExitSuccess
run_lexer :: UserLexerMode -> String -> [Token]
run_lexer m s = go ('\n', [], s)
where go i@(_,_,s') = case alexScanUser m i 0 of
AlexEOF -> []
AlexError _i -> [TLexError]
AlexSkip i' _len -> go i'
AlexToken i' len t -> t (take len s') : go i'
test1 = run_lexer FiveIsMagic "5 x"
result1 = [TFive '5',TWSpace ' ',TAlpha 'x']
test2 = run_lexer NormalMode "5 x"
result2 = [TDigit '5',TWSpace ' ',TAlpha 'x']
}
alex-3.2.5/tests/tokens_strict_bytestring.x
{
{-# LANGUAGE OverloadedStrings #-}
module Main (main) where
import System.Exit
import Data.ByteString.Char8 (unpack)
}
%wrapper "strict-bytestring"
%encoding "ISO-8859-1"
$digit = 0-9 -- digits
$alpha = [a-zA-Z] -- alphabetic characters
tokens :-
$white+ ;
"--".* ;
let { \_ -> Let }
in { \_ -> In }
$digit+ { \s -> Int (read (unpack s)) }
[\=\+\-\*\/\(\)] { \s -> Sym (head (unpack s)) }
$alpha [$alpha $digit \_ \']* { \s -> Var (unpack s) }
{
-- Each right-hand side has type :: ByteString -> Token
-- The token type:
data Token =
Let |
In |
Sym Char |
Var String |
Int Int |
Err
deriving (Eq,Show)
main = if test1 /= result1 then exitFailure
else exitWith ExitSuccess
test1 = alexScanTokens " let in 012334\n=+*foo bar__'"
result1 = [Let,In,Int 12334,Sym '=',Sym '+',Sym '*',Var "foo",Var "bar__'"]
}
alex-3.2.5/tests/unicode.x
{
-- Tests the basic operation.
module Main where
import Data.Char (toUpper)
import Control.Monad
import System.Exit
import System.IO
}
%wrapper "monad"
@word = [A-Za-z]+
tokens :-
<0> {
"αω" { string }
[AΓ] { character }
. { other }
}
{
string :: AlexInput -> Int -> Alex String
string (_,_,_,_) _ = return "string!"
other :: AlexInput -> Int -> Alex String
other (_,_,_,input) len = return (take len input)
character :: AlexInput -> Int -> Alex String
character (_,_,_,_) _ = return "PING!"
alexEOF :: Alex String
alexEOF = return "stopped."
scanner :: String -> Either String [String]
scanner str = runAlex str $ do
let loop = do tok <- alexMonadScan
if tok == "stopped." || tok == "error."
then return [tok]
else do toks <- loop
return (tok:toks)
loop
main :: IO ()
main = do
let test1 = scanner str1
when (test1 /= out1) $
do hPutStrLn stderr "Test 1 failed:"
print test1
exitFailure
let test2 = scanner str2
when (test2 /= out2) $
do hPutStrLn stderr "Test 2 failed:"
print test2
exitFailure
let test3 = scanner str3
when (test3 /= out3) $
do hPutStrLn stderr "Test 3 failed:"
print test3
exitFailure
let test4 = scanner str4
when (test4 /= out4) $
do hPutStrLn stderr "Test 4 failed:"
print test4
exitFailure
str1 = "A."
out1 = Right ["PING!",".","stopped."]
str2 = "\n"
out2 = Left "lexical error at line 1, column 1"
str3 = "αω --"
out3 = Right ["string!"," ","-","-","stopped."]
str4 = "βΓ"
out4 = Right ["β","PING!","stopped."]
}