Limits on memory for pathfinding algorithms vs lookup tables
I'm writing a C code generator geared toward RobotC and complex tasks for an FTC team, and I have some performance and storage questions:
- How much memory is available for my program's data? It'll be mostly pre-defined lookup tables, generally in the form of multidimensional arrays.
- How much NXT memory is available for my program itself? As in, roughly how much code can I expect to fit into a single RobotC compiled program?
- How quickly do programs execute, generally? Looking at the disassembly, most of my generated lines correspond to 2-4 opcodes.
I'm using NXT/Tetrix. My main interest behind these questions right now is pathfinding. I plan to use a 64x64 grid and run A* (Dijkstra's with a heuristic) where the heuristic assigns a penalty to turns and is as close to consistent as possible (I'm not sure consistency/monotonicity is achievable once the turn penalty is included).
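To make the heuristic concrete, here's roughly what I have in mind, as a standard-C sketch (the names and the `TURN_COST` value are just illustrative, not anything from RobotC):

```c
#include <stdlib.h>

#define TURN_COST 3  /* illustrative penalty, in the same units as a grid step */

/* Manhattan distance plus a minimal turn penalty. If the goal is off both
 * the current row and column, any 4-connected path must change direction
 * at least once, so adding one TURN_COST keeps the estimate admissible.
 * The heading parameter is unused here; a later refinement could use it
 * to charge for starting out facing away from the goal. */
int heuristic(int x, int y, int heading, int gx, int gy)
{
    int h = abs(gx - x) + abs(gy - y);
    if (gx != x && gy != y)
        h += TURN_COST;  /* at least one turn is unavoidable */
    return h;
}
```

Whether this stays consistent (h(n) <= cost(n, n') + h(n')) depends on how the turn cost is charged along actual edges, which is exactly the part I'm unsure about.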
Roughly 8 paths would be cached if I decide to go with pre-computed lookup tables.
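For a sense of scale, here's the kind of sizing I'm doing for those tables (plain C; the names and the 128-waypoint cap are my own assumptions, not a RobotC API):

```c
#include <stddef.h>

typedef struct { char x; char y; } Waypoint;  /* one grid cell, 2 bytes */

/* 8 cached paths of up to 128 waypoints each:
 * 8 * 128 * sizeof(Waypoint) = 8 * 128 * 2 = 2048 bytes. */
Waypoint cachedPath[8][128];

/* A full 64x64 grid of 16-bit costs, for comparison:
 * 64 * 64 * 2 = 8192 bytes. */
short costGrid[64][64];
```

Since these are fixed-size globals, the footprint is bounded at compile time, which is why I'm asking how much data memory I actually have to work with.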
Instead of a set data structure, I'll probably use a boolean array for the visited (closed) set. Since I'm working with a square layout, a 2D array can also serve as the came-from map needed to reconstruct the path.
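Something like this is what I mean, as a standard-C sketch (the direction encoding and function names are hypothetical):

```c
#include <stdbool.h>

#define GRID 64

bool visited[GRID][GRID];             /* closed set as a flat boolean grid  */

/* For each cell, the direction back toward its parent: 0-3 = N/E/S/W.
 * Values > 3 (e.g. an initial fill of 0xFF) mean "no parent recorded". */
unsigned char cameFrom[GRID][GRID];

/* Walk back from the goal to the start, writing the path in reverse order.
 * Returns the number of cells written, or -1 if the parent chain is broken
 * or doesn't fit in maxLen. */
int reconstruct(int gx, int gy, int sx, int sy,
                char outX[], char outY[], int maxLen)
{
    static const int dx[4] = { 0, 1, 0, -1 };  /* N, E, S, W */
    static const int dy[4] = { -1, 0, 1, 0 };
    int n = 0, x = gx, y = gy;

    while (n < maxLen) {
        outX[n] = (char)x;
        outY[n] = (char)y;
        n++;
        if (x == sx && y == sy)
            return n;                  /* reached the start */
        unsigned char d = cameFrom[y][x];
        if (d > 3)
            return -1;                 /* broken chain */
        x += dx[d];                    /* step back toward the start */
        y += dy[d];
    }
    return -1;
}
```

That's 4 KB for `visited` (or 512 bytes if I bit-pack it) plus 4 KB for `cameFrom`, which is part of why I'm asking about data-memory limits above.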
I'd love some feedback and answers to my questions if anyone has any. Thanks!