I am creating a basic tile-based 2D game, mostly from scratch, in Java, but all I need is pseudo-code for how this could be achieved. My problem is that my world (stored in a HashMap so I don't have to keep a null object for every empty position) may be millions or even billions of tiles large, so I'm sure it is not efficient to loop through every tile.

A solution I have thought of is to calculate, from the camera's position in the world and the size of the tiles, how many tiles fit on the screen along the X and Y axes, as well as their pixel offset relative to the screen. I have tried (and failed) to implement this, because I am unsure how to express it in maths/logic. How might I go about this?

Below I have linked an image showing which tiles I intend to draw to the screen, where the black square is the centre and tiles that do not cover the screen are not shown. Thanks for your help in advance!
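The calculation described above can be sketched roughly as follows, a minimal Java sketch under a few assumptions: the camera position is the centre of the screen in world pixels, `TILE_SIZE` is a hypothetical square tile edge length, and `visibleRange` is an illustrative helper name, not from the original post. The key detail is using `Math.floor` rather than integer division, so that negative world coordinates still map to the correct tile index.

```java
public class TileCuller {
    // Assumed tile edge length in pixels (hypothetical value).
    static final int TILE_SIZE = 32;

    /**
     * Returns the inclusive range of tile indices visible on screen as
     * { firstX, lastX, firstY, lastY }, assuming the camera sits at the
     * centre of the screen.
     */
    static int[] visibleRange(double cameraX, double cameraY,
                              int screenWidth, int screenHeight) {
        // World-space edges of the screen, camera at the centre.
        double left   = cameraX - screenWidth  / 2.0;
        double right  = cameraX + screenWidth  / 2.0;
        double top    = cameraY - screenHeight / 2.0;
        double bottom = cameraY + screenHeight / 2.0;

        // Divide by the tile size and floor to get tile indices; floor
        // (not integer division) keeps negative coordinates correct.
        int firstX = (int) Math.floor(left   / TILE_SIZE);
        int lastX  = (int) Math.floor(right  / TILE_SIZE);
        int firstY = (int) Math.floor(top    / TILE_SIZE);
        int lastY  = (int) Math.floor(bottom / TILE_SIZE);
        return new int[] { firstX, lastX, firstY, lastY };
    }
}
```

With that range, the draw loop only iterates over on-screen tiles: for each `(tx, ty)` in the range, look the tile up in the HashMap (skipping misses), and draw it at screen position `(tx * TILE_SIZE - left, ty * TILE_SIZE - top)`, so the cost depends on the screen size, never on the world size.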
How can I calculate all the tiles visible to a camera in 2D? Posted on gamedev.stackexchange.com, August 15, 2019.