A “bit” is a basic unit of information entropy. It’s binary, either on or off, present or absent, one or zero.
A “string” in computer programming is a sequence of items; strings may be of fixed or variable length. Eight, sixteen, thirty-two and sixty-four bit numbers are fixed length. A text string is variable length.
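A minimal Python illustration of the difference (the values are arbitrary): a fixed-width integer always packs to the same number of bytes, while a text string’s encoded length depends on its content.

```python
import struct

# A 32-bit unsigned integer always occupies exactly four bytes...
fixed = struct.pack("<I", 6)
print(len(fixed))                        # 4, whatever value is stored

# ...while the length of a text string depends on what it says.
for text in ("six", "six hundred and sixty-six"):
    print(len(text.encode("utf-8")))     # 3, then 25
```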
A byte is a series of eight bits that’s used as a standard representation for typographic characters, colour values and many other things. Up until IBM’s System/360 project in the mid-1960s there was no real standard for this - computers might be decimal, or alphabetic, or have “words” of sizes from four to twenty-four bits. Some Soviet computers of the same period used ternary logic rather than binary. Alan Turing used a logarithmic measure of information entropy called a “ban”. So be wary of naturalising the bit and the eight-bit byte, but when you see bits grouped together in strings of lengths that divide neatly into eight, recognise that this is related to the reality of how most modern computer systems divide up their memory.
Bitstrings can be used to represent the presence or absence of properties. A fixed-length bitstring is a bitfield, but we’re going to stick with the more general name. Integer numbers represented in binary use bits to represent the presence or absence of powers of two of increasing size within the number. 0110 is six in a four bit “nibble”. UNIX filesystems represent the permissions that the owner and other users of a file have to access and manipulate it as a sequence of bits.
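Here is a minimal Python sketch of both ideas - binary place value and UNIX permission bits tested with bitwise AND. The 0o644 mode is just an arbitrary common example.

```python
import stat

# 0b0110 is six: the bit worth four and the bit worth two are set.
print(0b0110)                      # 6

# 0o644 is a common UNIX permission bitfield: rw-r--r--
mode = 0o644

# Each permission is a single bit, tested with a bitwise AND.
print(bool(mode & stat.S_IRUSR))   # True  - owner can read
print(bool(mode & stat.S_IWUSR))   # True  - owner can write
print(bool(mode & stat.S_IWOTH))   # False - others cannot write
```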
Such bitfields can be found throughout computing. The satirical proposal for an “evil bit” (RFC 3514) to be set on Internet messages that have evil intent shows both the prevalence of bitstrings and their users’ awareness of the limitations of binary thinking and computational representation.
As with their use to represent integer numbers in binary, bits can represent doublings or halvings of quantities. It takes 33 bits of entropy to uniquely identify an individual among seven billion on Earth. Cryptographic hashes, which produce compact, effectively unique “names” for any input file of any length, often output 128, 160 or 256 bit values. Each bit doubles the possible size, quantity, or uniqueness of the thing it represents. It also doubles the size of the space in which it can hide.
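A couple of lines of Python make both figures concrete (the hashed input is arbitrary):

```python
import hashlib
import math

# About 33 bits are enough to number everyone on Earth:
# 2**33 is roughly 8.6 billion, comfortably more than 7 billion.
print(math.ceil(math.log2(7_000_000_000)))    # 33

# A cryptographic hash gives a fixed-size "name" to input of any length.
digest = hashlib.sha256(b"any input, of any length").digest()
print(len(digest) * 8)                        # 256 bits
```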
Contemporary encryption and signing systems use keys several thousand bits in length, which would take a conventional computer an infeasible amount of time to break. A related property - the computational cost of searching enormous spaces of bit values - is used in Bitcoin mining to create cryptographic puzzles that require capital outlay to solve.
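A toy proof-of-work search in Python illustrates the shape of such puzzles; the sixteen-bit difficulty and the block text are invented here, and real Bitcoin mining differs in many details.

```python
import hashlib
from itertools import count

def mine(block: bytes, difficulty_bits: int = 16) -> int:
    """Find a nonce whose hash with the block has `difficulty_bits` leading zero bits."""
    target = 1 << (256 - difficulty_bits)      # hashes below this value qualify
    for nonce in count():
        digest = hashlib.sha256(block + str(nonce).encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

print(mine(b"a toy block header"))
# Each extra bit of difficulty roughly doubles the expected work.
```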
A proposal for “vectored signatures” for the “V” version control system uses features of these different strings of bits. It represents assertions about an individual’s relationship to and opinion of a piece of code as a bitstring, and asserts the identity of that individual using cryptographic signatures. This combination is a generalization both of cryptographic “keysigning” as a recognition of identity and of Bitcoin transactions, which are cryptographic signatures on communications between individuals about a single-dimensional (monetary) quantity.
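A hedged sketch of the idea in Python, using the third-party cryptography package for Ed25519 signatures; the flag names, message layout and commit identifier are invented for illustration rather than taken from the actual V proposal.

```python
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Hypothetical assertion bits - the real proposal's vocabulary may differ.
AUTHORED = 0b0001
REVIEWED = 0b0010
TESTED   = 0b0100
APPROVED = 0b1000

def sign_assertion(key: Ed25519PrivateKey, commit_id: str, assertions: int) -> bytes:
    """Sign a bitstring of assertions about a commit, binding an opinion to an identity."""
    message = commit_id.encode() + assertions.to_bytes(1, "big")
    return key.sign(message)

key = Ed25519PrivateKey.generate()
signature = sign_assertion(key, "example-commit-id", REVIEWED | APPROVED)

# Anyone holding the public key can verify both who made the claim and what it was.
message = b"example-commit-id" + bytes([REVIEWED | APPROVED])
key.public_key().verify(signature, message)   # raises InvalidSignature if tampered with
```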
The bitstring representation of logical operators developed by the Logical Geometry project provides a compact and information-rich notation for various logics. Each bit represents a fact about an operator such as “true in all possible worlds”, and relates to geometric and trellis representations of the same operators. Bitwise operations on these representations are meaningful - for example bitwise NOT on p (1100) gives ¬p (0011).
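In this four-bit encoding each bit marks one region of logical space - here, in order, p∧q, p∧¬q, ¬p∧q and ¬p∧¬q (this ordering is one common convention, assumed for the example rather than taken from the project) - so ordinary bitwise operators compute the logical ones:

```python
MASK = 0b1111            # the four regions: p∧q, p∧¬q, ¬p∧q, ¬p∧¬q

p = 0b1100               # true wherever p holds
q = 0b1010               # true wherever q holds

print(bin(~p & MASK))    # 0b11   -> 0011, i.e. ¬p
print(bin(p & q))        # 0b1000 -> p ∧ q
print(bin(p | q))        # 0b1110 -> p ∨ q
print((p | (~p & MASK)) == MASK)   # True: p ∨ ¬p is the tautology 1111
```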
The combination of logically manipulable bitstring representations (as with Logical Geometry) asserted through cryptographic signatures (as with vectored signatures) seems like a possibly fruitful area of investigation.