CETT Functions
Low-level CETT extraction and causal intervention utilities. These are used internally by HProbes but are also available for custom pipelines.
hprobes.cett.available_layers(model)
hprobes.cett.precompute_col_norms(model, layers)
Precompute ‖W_down[:, j]‖₂ for each layer.
Returns dict mapping layer_idx → (intermediate_dim,) tensor of column norms. Computed once and reused across all samples.
Source code in src/hprobes/cett.py
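What this precomputation does can be sketched as follows. The LLaMA-style attribute path `model.model.layers[i].mlp.down_proj` is an assumption for illustration, not necessarily how the library locates the down-projection:

```python
import torch

def precompute_col_norms_sketch(model, layers):
    """Hedged sketch of precompute_col_norms; module layout is assumed."""
    norms = {}
    for idx in layers:
        # down_proj.weight has shape (hidden_dim, intermediate_dim), so the
        # norm over dim=0 yields one value per intermediate column j.
        w_down = model.model.layers[idx].mlp.down_proj.weight
        norms[idx] = w_down.norm(dim=0).float()
    return norms
```

Because the weights are fixed, this runs once per model and the resulting dict is reused for every sample.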
hprobes.cett.forward_cett(model, tokens, layers, col_norms, token_position=-1)
Single forward pass — extract CETT at a given token position.
Parameters
model : causal LM
tokens : tokenizer output (input_ids and attention_mask on the correct device)
layers : list of layer indices to hook
col_norms : precomputed column norms from precompute_col_norms()
token_position : which token to extract CETT at (-1 = last token)
Returns
cett_vec : (n_layers * intermediate_dim,) float32 — concatenated CETT values
logits : (vocab_size,) float32 — output logits at the last token
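A common definition of this kind of per-neuron contribution (from the activation-sparsity literature) is |a_j| · ‖W_down[:, j]‖₂, normalized by the MLP output norm; whether forward_cett uses this exact normalization is an assumption here, so treat the sketch as illustrative only:

```python
import torch

def cett_per_neuron(acts, col_norms, mlp_out):
    # Hedged sketch: |a_j| * ||W_down[:, j]||_2 divided by the MLP output
    # norm. acts and col_norms are (intermediate_dim,); mlp_out is (hidden_dim,).
    contrib = acts.abs() * col_norms
    return contrib / mlp_out.norm().clamp_min(1e-8)
```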
hprobes.cett.forward_cett_at_token(model, tokens, extra_token_id, layers, col_norms)
Append one token to the input and capture CETT at that appended position.
Returns
cett_answer : (n_layers * intermediate_dim,) float32
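The append step performed before the forward pass can be sketched as below, using the input_ids/attention_mask convention from the tokenizer output described above:

```python
import torch

def append_token(tokens, extra_token_id):
    # Sketch of the append: concatenate one token id onto input_ids and
    # extend attention_mask by a matching 1, so CETT can be captured at the
    # new final position.
    ids = tokens["input_ids"]
    mask = tokens["attention_mask"]
    extra = torch.full((ids.size(0), 1), extra_token_id,
                       dtype=ids.dtype, device=ids.device)
    return {
        "input_ids": torch.cat([ids, extra], dim=1),
        "attention_mask": torch.cat([mask, torch.ones_like(extra)], dim=1),
    }
```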
hprobes.cett.forward_cett_span(model, tokens, span_start, span_end, layers, col_norms, aggregation='mean')
Forward pass over a full sequence — extract CETT aggregated over a token span.
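The span aggregation can be sketched as follows; treating span_end as exclusive, and supporting 'max' alongside the documented 'mean' default, are assumptions for illustration:

```python
import torch

def aggregate_span(cett_per_token, span_start, span_end, aggregation="mean"):
    # cett_per_token: (seq_len, n_layers * intermediate_dim) CETT values.
    span = cett_per_token[span_start:span_end]  # span_end assumed exclusive
    if aggregation == "mean":
        return span.mean(dim=0)
    if aggregation == "max":  # hypothetical alternative to the 'mean' default
        return span.max(dim=0).values
    raise ValueError(f"unknown aggregation: {aggregation!r}")
```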
hprobes.cett.forward_cett_batch(model, batch_tokens, layers, col_norms, token_positions)
Batched forward pass — extract CETT for each sample at its specified token position.
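The per-sample position selection in a batched pass can be sketched with an advanced-indexing gather; negative positions index from the end, matching the -1 convention above:

```python
import torch

def gather_positions(cett_all, token_positions):
    # cett_all: (batch, seq_len, n_features); token_positions: one index per
    # sample. Picks each sample's CETT vector at its own token position.
    idx = torch.as_tensor(token_positions, device=cett_all.device)
    batch_idx = torch.arange(cett_all.size(0), device=cett_all.device)
    return cett_all[batch_idx, idx]
```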
hprobes.cett.forward_cett_at_token_batch(model, batch_tokens, extra_token_ids, layers, col_norms)
Batched version of forward_cett_at_token.
hprobes.cett.scale_h_neurons(model, tokens, h_neurons, alpha, layers)
Forward pass with the given H-Neuron activations scaled by alpha — a causal intervention (alpha=0 ablates the neurons, alpha=1 leaves them unchanged).
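The intervention can be sketched as a forward pre-hook that scales the chosen H-Neuron columns of the activation entering the down-projection; the hook placement and the `model.model.layers[i].mlp.down_proj` path in the usage comment are assumptions about the module layout:

```python
import torch

def make_scaling_hook(neuron_idx, alpha):
    # Sketch: scale the selected intermediate activations (the input to
    # down_proj) by alpha before the projection. alpha=0 ablates the neurons.
    def hook(module, inputs):
        (act,) = inputs
        act = act.clone()  # avoid mutating the original activation in place
        act[..., neuron_idx] = act[..., neuron_idx] * alpha
        return (act,)
    return hook

# usage (hypothetical layout):
# handle = model.model.layers[i].mlp.down_proj.register_forward_pre_hook(
#     make_scaling_hook(h_neurons[i], alpha))
# ...run the forward pass, then handle.remove()
```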