Are there any writings on the relationship between lexicography, discourse analysis, and text/data mining? As I continue work on a keyword-specific project, I'm wondering where the theoretical, historical, and methodological overlaps are in these various ways of mapping epistemology through language.
There are tons of questions one can ask across these fields, but I'm thinking in particular about whether something like the peaks and valleys in a frequency analysis constitute the rise and fall of a Foucauldian "discursive regularity." I recently saw Bernard Geoghegan's piece in Critical Inquiry on the enthusiasm for cybernetics research among French structuralists (a history he argues is important to revisit in light of today's digital revolutions in the humanities), but it raises more questions than it answers, wonderful as those questions are.
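To make the "peaks and valleys" idea concrete, here is a minimal sketch of the kind of keyword-frequency analysis I have in mind. The corpus, years, and keyword are all hypothetical toy data of my own invention, not from any real project; the point is just that a "peak" in a frequency series is a simple, mechanical thing compared to what a discursive regularity is supposed to be.

```python
# A toy sketch of keyword frequency over time with peak detection.
# The corpus below is invented for illustration only.
from collections import Counter

# Hypothetical corpus: year -> list of (tiny) documents.
corpus = {
    1950: ["feedback loops in machines", "control and communication"],
    1955: ["cybernetics and the mind", "cybernetics of language"],
    1960: ["structure of myth", "cybernetics fades from view"],
    1965: ["structuralism ascendant", "signs and systems"],
}

keyword = "cybernetics"

# Relative frequency of the keyword per year.
freq = {}
for year, docs in corpus.items():
    tokens = [t for doc in docs for t in doc.lower().split()]
    freq[year] = Counter(tokens)[keyword] / len(tokens)

# Flag local peaks: interior years whose frequency exceeds both neighbors.
years = sorted(freq)
peaks = [
    y for i, y in enumerate(years[1:-1], start=1)
    if freq[y] > freq[years[i - 1]] and freq[y] > freq[years[i + 1]]
]
print(peaks)  # the "rise and fall" reduced to a list of peak years
```

Whether that list of peak years says anything about the conditions under which statements become sayable is exactly the theoretical gap I'm asking about.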
Any leads would be much appreciated!