The code I have for handling comments is getting hairy because I forgot about single-ticks within double-quotes. For example,
```
6015IF WA=0THENPRINT"Error: File '";WF$;"' File not found.": END
```
That line was getting cut off at the first single tick after the word "File". In the meantime, I've kludged the script to simply pass through any lines with double quotes.
The proper solution would be to write a lexer using flex. Technically, a flex lexer is no more powerful than a regular expression, but it should be a lot easier to understand and debug.
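To make the idea concrete, here is a minimal flex sketch (this is not the actual tandy-decomment source; the file name decomment.l and the rule set are illustrative assumptions). The trick is to consume a double-quoted string as a single token, so a tick inside it can never start a comment:

```lex
%option noyywrap
%%
\"[^"\n]*\"?   ECHO;  /* emit a quoted string whole (even if unterminated at EOL) */
'[^\n]*        ;      /* drop a single-tick comment through end of line */
.|\n           ECHO;  /* copy everything else verbatim */
%%
int main(void) { return yylex(); }
```

Build it with `flex decomment.l && cc lex.yy.c -o decomment`. The real grammar has more wrinkles (REM statements, the colon before a trailing comment), which is exactly why a lexer is easier to extend than a pile of regexes.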
I did the "proper solution", mentioned above. Or at least, I think I did. It was back in 2022 so I've forgotten if there were severe hiccups. It should work on MacOS with a simple make install.
The current version of https://github.com/hackerb9/tokenize contains a program called "tandy-decomment" which, while not complete, does remove most of the comments. Typically one would use the frontend "tokenize -d" to decomment and tokenize at the same time.
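For anyone landing here, the workflow is roughly the following (the input filename is hypothetical; only `make install` and `tokenize -d` are confirmed above, so check the repo's README for the exact invocation):

```
git clone https://github.com/hackerb9/tokenize
cd tokenize
make install            # the "simple make install" mentioned above
tokenize -d MYPROG.BA   # hypothetical input file; -d decomments while tokenizing
```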
I replaced adjunct/removecomments in the linked pull request with adjunct/tandy-decomment, which should run fine on any system with a C compiler (MacOS, Linux, what have you...). Merging #55 will close this issue.