Coming from a PC enthusiast background, I always thought of 32-bit PCI as perfectly serviceable, with 64-bit PCI only ever showing up in exotic server configurations.
What benefit did Apple see in adopting 64-bit PCI in these systems, what kinds of cards actually took advantage of this technology, and why was it never a thing in mainstream PC computing?
If I recall correctly, the PCI bus was shared rather than point-to-point like PCIe, so every card on the bus competed for the same bandwidth; with multiple cards installed, it made sense to buy high-end 64-bit cards just to keep the bus from becoming the bottleneck (see the rough bandwidth numbers below).
But the lack of useful desktop/workstation cards, and the PC world's move to PCIe, killed it off.
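To put rough numbers on the shared-bus point, here's a quick back-of-the-envelope sketch (my own illustration, not from the original question) using the standard theoretical peaks for the common PCI variants; real-world throughput is lower once arbitration and protocol overhead are counted, and the gigabit Ethernet comparison is just one plausible example of a bandwidth-hungry card:

```python
# Theoretical peak bandwidth of shared PCI variants vs. one gigabit Ethernet card.

def pci_peak_mb_s(bus_width_bits: int, clock_mhz: float) -> float:
    """Theoretical peak transfer rate in MB/s (ignores protocol/arbitration overhead)."""
    return bus_width_bits / 8 * clock_mhz  # bytes per clock * million clocks per second

variants = {
    "32-bit / 33 MHz PCI": pci_peak_mb_s(32, 33),  # ~133 MB/s, shared by every card on the bus
    "64-bit / 33 MHz PCI": pci_peak_mb_s(64, 33),  # ~266 MB/s
    "64-bit / 66 MHz PCI": pci_peak_mb_s(64, 66),  # ~533 MB/s
}

gigabit_ethernet_mb_s = 1000 / 8  # ~125 MB/s line rate for a single GigE card

for name, peak in variants.items():
    print(f"{name}: ~{peak:.0f} MB/s peak, "
          f"{peak / gigabit_ethernet_mb_s:.1f}x one gigabit Ethernet card")
```

The takeaway: a single gigabit NIC or a fast SCSI RAID controller could come close to saturating plain 32-bit/33 MHz PCI all by itself, which is why server and workstation boards (Apple's included) offered the wider and faster slots.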