Yes, it is dangerous, because hashes can be made of any file - not just of child sexual abuse material (CSAM), but of pictures of demonstrations, audio recordings of speeches, memes, political manifestos, pictures of yellow-and-blue flags, etc. Apple gave the world an algorithmic blueprint that authoritarian regimes could use to detect all sorts of material while nominally protecting 'privacy'. It was incredibly naive and negligent of Apple - the end product of letting engineers run amok without due consideration of the impact of their work.
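To see why the system is content-agnostic, here is a deliberately simplified sketch of hash-based matching. This is not Apple's NeuralHash or Microsoft's PhotoDNA - it uses a toy "average hash" over a downscaled 8x8 grayscale grid, and the function names and threshold are my own invention - but the matching step is structurally the same: the pipeline only compares numbers against a blocklist, and nothing in it knows or cares what the flagged images actually depict.

```python
def average_hash(pixels):
    """Toy perceptual-style hash: pixels is a list of 64 grayscale
    values (0-255) from an image downscaled to 8x8."""
    mean = sum(pixels) / len(pixels)
    # One bit per pixel: set if the pixel is brighter than average.
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_flagged(image_pixels, blocklist, threshold=4):
    """Flag an image if its hash is near any hash on the blocklist.
    The blocklist is just a set of integers; whoever controls it
    decides what gets detected - CSAM, memes, or protest photos."""
    h = average_hash(image_pixels)
    return any(hamming(h, bad) <= threshold for bad in blocklist)
```

Because the hash tolerates small pixel changes (via the Hamming-distance threshold), lightly edited copies of a blocklisted image still match - which is exactly the property that makes such systems effective against re-shared material, and equally effective no matter what a government puts on the list.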
Nothing Apple did was new. Everything they used was known technology, developed by others before them.
The frontrunners in this type of technology have been Google and Microsoft. In fact, Microsoft has donated a CSAM-detection solution (PhotoDNA) that is made available to other tech companies on a case-by-case basis.