Tweak tokenise.
tristanmorgan committed Mar 26, 2024
1 parent d1ac3b5 commit a1e1054
Showing 2 changed files with 33 additions and 14 deletions.
30 changes: 24 additions & 6 deletions README.md
@@ -6,16 +6,34 @@ BFG is an optimised [Brainfuck](https://esolangs.org/wiki/Brainfuck) interpreter
 
 Uses signed ints for data (platform specific 32/64), memory wraps around at 65535, EOF returns -1.
 
+Buffered output flushes on newline, 200 chars or input.
+
 ## Optimisations
 
-[-] is replaced with a blind set 0 command
+Operates with an instruction parse then execute pattern.
 
-repeat ++++ or --- are replaced with a single addition/subtraction
+* loop start/end are calculated up front.
+* [-] is replaced with a blind set 0 command
+* repeat ++++ or --- are replaced with a single addition/subtraction
+* addition/subtraction after zero does a blind set.
+* repeat >>> or <<< are replaced with a single pointer jump
+* [>>>] and [<<<] are merged into a skip instruction.
+* [>>+<<-] and [->>+<<] merged into a move instruction.
 
-addition/subtraction after zero does a blind set.
+for performance comparison see no_optimisation branch.
 
-repeat >>> or <<< are replaced with a single pointer jump
+## Usage
 
-[>>+<<-] and [->>+<<] merged into a move instruction.
+    Usage:
+      bf [option] source.bf [input]
+
+    Options:
+      -version
+        display version
+
+May use - as source to read program from STDIN and output is STDOUT
 
-buffered output prints on newline, 200 chars or input.
+    +++++++++[>++++++++>++++++++++++>++++>++++++++++++>+++++++++++>+<<<<<<-]>---.>++
+    .----.+++++.++++++++++.>----.<-----.>>----.+.<<-.>.<<---.>-.>>>--.<.+++++.<<<+++
+    +.>+++.>>>++.<---.<+.>>>+.
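The optimisation list above amounts to a small peephole pass over the instruction stream: repeated +/- and >/< collapse into a single instruction, and the [-] idiom becomes one blind set. A minimal, self-contained Go sketch of just those two rules — using simplified stand-in types, not the repository's actual Instruction/NewInstruction definitions — could look like:

```go
package main

import "fmt"

type op int

const (
	opAddVal op = iota // '+' / '-' folded into one signed add
	opAddDp            // '>' / '<' folded into one pointer move
	opSetVal           // blind set, e.g. the "[-]" idiom becomes "set 0"
)

type instruction struct {
	operator op
	operand  int
}

// tokenise folds runs of '+'/'-' and '>'/'<' into single instructions and
// rewrites "[-]" as a blind set-to-zero, roughly as the README list describes.
// Loop bookkeeping and the other merges are omitted to keep the sketch short.
func tokenise(src string) []instruction {
	var prog []instruction
	for i := 0; i < len(src); i++ {
		switch c := src[i]; c {
		case '+', '-', '>', '<':
			kind, delta := opAddVal, 1
			if c == '>' || c == '<' {
				kind = opAddDp
			}
			if c == '-' || c == '<' {
				delta = -1
			}
			if n := len(prog); n > 0 && prog[n-1].operator == kind {
				prog[n-1].operand += delta // merge with the previous identical op
				continue
			}
			prog = append(prog, instruction{kind, delta})
		case '[':
			if i+2 < len(src) && src[i+1] == '-' && src[i+2] == ']' {
				prog = append(prog, instruction{opSetVal, 0}) // "[-]" => blind set 0
				i += 2
				continue
			}
			// full loop handling omitted in this sketch
		}
	}
	return prog
}

func main() {
	fmt.Println(tokenise("++++>>>[-]")) // [{0 4} {1 3} {2 0}]
}
```

The real tokeniser (see parser/tokenise.go below) does this merging incrementally against the previously emitted instruction as bytes are read, rather than over a finished string.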
17 changes: 9 additions & 8 deletions parser/tokenise.go
@@ -18,28 +18,29 @@ func Tokenise(input io.ByteReader) (program []Instruction, err error) {
        } else if err != nil {
            return nil, errors.New("tokenisation read error")
        }
-       program = append(program, NewInstruction(chr))
-       switch program[pc].operator {
+       instruction := NewInstruction(chr)
+       program = append(program, instruction)
+       switch instruction.operator {
        case opNoop:
            program = program[:pc]
            pc--
        case opAddDp:
-           if program[pc-1].SameOp(program[pc]) {
-               program[pc-1].operand += program[pc].operand
+           if program[pc-1].SameOp(instruction) {
+               program[pc-1].operand += instruction.operand
                program = program[:pc]
                pc--
            }
        case opAddVal:
-           if program[pc-1].SameOp(program[pc]) {
-               program[pc-1].operand += program[pc].operand
+           if program[pc-1].SameOp(instruction) {
+               program[pc-1].operand += instruction.operand
                program = program[:pc]
                pc--
            } else if program[pc-1].operator == opSetVal {
-               program[pc-1].operand += program[pc].operand
+               program[pc-1].operand += instruction.operand
                program = program[:pc]
                pc--
            } else if program[pc-1].operator == opJmpNz {
-               operand := program[pc].operand
+               operand := instruction.operand
                program = program[:pc]
                program = append(program, Instruction{opSetVal, operand})
            }
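As a rough usage sketch of the tokeniser after this change — the import path is assumed for illustration and is not confirmed by this commit — feeding a short program through Tokenise and printing the folded instructions might look like:

```go
package main

import (
	"fmt"
	"strings"

	// Assumed import path for illustration only; use the repository's real module path.
	"github.com/tristanmorgan/bfg/parser"
)

func main() {
	// Per the README: "+++" should fold into a single add, ">>" into a single
	// pointer move, and "[-]" into a blind set 0.
	program, err := parser.Tokenise(strings.NewReader("+++>>[-]"))
	if err != nil {
		fmt.Println("tokenise error:", err)
		return
	}
	for _, ins := range program {
		fmt.Printf("%+v\n", ins)
	}
}
```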
