Hacker News

I don't really see Abstract Algebra as a super relevant topic for modern computer science. I agree on Linear Algebra and basic abstract algebra (group theory, cyclic groups, etc., the material needed to understand topics like RSA), but subjects like Galois theory and other advanced topics are just not relevant. They still teach you a different way to think about problems (which is good), but they belong in a math curriculum rather than a CS one.


I've gotten a lot of use out of group theory and lattice theory, and particularly semigroups and semilattices. What most programmers don't realize is that design patterns (a la Gang of Four), when they aren't working around a language's quirks, are implementing some algebra.
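For one concrete (and hypothetical, not the poster's own) illustration of what "implementing some algebra" can mean: a join-semilattice is just a type with a merge operation that is associative, commutative, and idempotent, and many "combine these states/configs" patterns are exactly that. A minimal sketch in Python:

```python
# A join-semilattice: a set with a merge (join) operation that is
# associative, commutative, and idempotent. Many "merge these
# states" patterns (config merging, CRDT-style replica sync) are
# instances of this algebra. The names here are illustrative.
from itertools import product

def join(a: frozenset, b: frozenset) -> frozenset:
    """Set union is the canonical join operation."""
    return a | b

# Check the semilattice laws on a few sample elements.
samples = [frozenset(), frozenset({1}), frozenset({1, 2}), frozenset({3})]
for x, y, z in product(samples, repeat=3):
    assert join(x, y) == join(y, x)                    # commutative
    assert join(join(x, y), z) == join(x, join(y, z))  # associative
for x in samples:
    assert join(x, x) == x                             # idempotent
```

The payoff of recognizing the algebra is that the laws, not the pattern's class diagram, tell you what you may safely do: merges can be reordered, batched, and retried without changing the result.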


Could you give us some pointers on where to learn more about this connection between design patterns and abstract algebra?


You know, I don't have any. It's something that I've thought about writing about for years now, but haven't gotten around to doing, and I'm unaware of anything else written about it.


Galois (finite) fields are very much relevant in error correction codes and cryptography. Any time you want to manipulate bytes (or words) as numbers that you can add/subtract and multiply/divide, you end up learning about finite fields. In fact, by computing a CRC or even an XOR checksum you are already taking the first step toward the topic.
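To make the byte arithmetic concrete, here is a minimal sketch of multiplication in GF(2^8). The choice of reduction polynomial is an assumption for illustration; this one, x^8 + x^4 + x^3 + x + 1 (0x11B), happens to be the polynomial AES uses:

```python
def gf256_mul(a: int, b: int) -> int:
    """Multiply two bytes as elements of GF(2^8).

    Addition in GF(2) is XOR (no carries); the product is reduced
    modulo an irreducible polynomial, here x^8 + x^4 + x^3 + x + 1
    (0x11B), the polynomial AES happens to use.
    """
    result = 0
    while b:
        if b & 1:
            result ^= a      # "add" (XOR) the current shift of a
        a <<= 1
        if a & 0x100:        # a degree-8 term appeared: reduce it
            a ^= 0x11B
        b >>= 1
    return result
```

Because every nonzero element has a multiplicative inverse, division works too; e.g. in this field 0x53 * 0xCA == 1, the worked example from the AES specification.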

A recent problem I needed a solution for: given n blocks of data of equal length, generate all possible XOR sums of the blocks in place (no additional memory). The answer is an irreducible polynomial of degree n over GF(2) with a primitive root, i.e. a primitive polynomial.
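A sketch of the algebraic fact this relies on (not the poster's actual code; n = 4 and the polynomial x^4 + x + 1 are assumptions for illustration): when the polynomial is primitive, repeatedly multiplying by x walks through every nonzero element of GF(2^n), i.e. every nonempty subset mask of the n blocks, exactly once:

```python
def nonzero_elements(poly: int, n: int):
    """Yield all 2^n - 1 nonzero elements of GF(2^n).

    Repeatedly multiply the state by x (shift left, reduce modulo
    `poly`). When `poly` is primitive, x generates the entire
    multiplicative group, so every nonzero n-bit value appears
    exactly once before the sequence cycles.
    """
    state = 1
    for _ in range(2**n - 1):
        yield state
        state <<= 1
        if state >> n:       # a degree-n term appeared: reduce it
            state ^= poly

# x^4 + x + 1 (0b10011) is primitive over GF(2), so the walk
# hits every nonzero 4-bit mask exactly once:
states = list(nonzero_elements(0b10011, 4))
assert sorted(states) == list(range(1, 16))
```

Each mask names one XOR combination of the blocks, which is where a primitive polynomial lets you enumerate all of them.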


Most people aren't building crypto, though (at least they really shouldn't be).

And maybe the answer found the problem you're describing, not the other way around. You were taught to look for and solve problems a certain way, so problems appear with those characteristics.



