Search Tokens (Enterprise)
Search for tokens using a Lucene-based query syntax.
Permissions
token:search
Request
- cURL
- JavaScript
- C#
- Java
- Python
- Go
curl "https://api.basistheory.com/tokens/search" \
-H "BT-API-KEY: <PRIVATE_API_KEY>" \
-H "Content-Type: application/json" \
-X "POST" \
-d '{
"query": "data:6789",
"page": 1,
"size": 20
}'
import { BasisTheory } from "@basis-theory/basis-theory-js";
const bt = await new BasisTheory().init("<PRIVATE_API_KEY>");
const tokens = await bt.tokens.search({
query: "data:6789",
page: 1,
size: 20,
});
using BasisTheory.net.Tokens;
var client = new TokenClient("<PRIVATE_API_KEY>");
var tokens = await client.SearchAsync(new TokenSearchRequest {
Query = "data:6789",
Page = 1,
PageSize = 20
});
import com.basistheory.*;
import com.basistheory.auth.*;

public class Example {
    public static void main(String[] args) throws Exception {
        ApiClient defaultClient = Configuration.getDefaultApiClient();
        defaultClient.setBasePath("https://api.basistheory.com");

        ApiKeyAuth ApiKey = (ApiKeyAuth) defaultClient.getAuthentication("ApiKey");
        ApiKey.setApiKey("<PRIVATE_API_KEY>");

        TokensApi apiInstance = new TokensApi(defaultClient);
        SearchTokensRequest searchTokensRequest = new SearchTokensRequest();
        TokenPaginatedList result = apiInstance.search(searchTokensRequest
            .query("data:6789")
            .page(1)
            .size(20));
    }
}
import basistheory
from basistheory.api import tokens_api
from basistheory.model.search_tokens_request import SearchTokensRequest

with basistheory.ApiClient(configuration=basistheory.Configuration(api_key="<PRIVATE_API_KEY>")) as api_client:
    token_client = tokens_api.TokensApi(api_client)

    tokens = token_client.search(search_tokens_request=SearchTokensRequest(
        query="data:6789",
        page=1,
        size=20
    ))
package main

import (
    "context"
    "fmt"
    "log"

    "github.com/Basis-Theory/basistheory-go/v3"
)

func main() {
    configuration := basistheory.NewConfiguration()
    apiClient := basistheory.NewAPIClient(configuration)
    contextWithAPIKey := context.WithValue(context.Background(), basistheory.ContextAPIKeys, map[string]basistheory.APIKey{
        "ApiKey": {Key: "<PRIVATE_API_KEY>"},
    })

    searchTokenRequest := *basistheory.NewSearchTokensRequest()
    searchTokenRequest.SetQuery("data:6789")
    searchTokenRequest.SetPage(1)
    searchTokenRequest.SetSize(20)

    tokens, httpResponse, err := apiClient.TokensApi.Search(contextWithAPIKey).SearchTokensRequest(searchTokenRequest).Execute()
    if err != nil {
        // Surface the API error and the raw HTTP response for debugging.
        log.Fatal(err, httpResponse)
    }

    fmt.Println(tokens)
}
Request Parameters
Attribute | Required | Type | Default | Description |
---|---|---|---|---|
query | false | string | null | A query string using Lucene query syntax. |
page | false | integer | 1 | Page number of the results to return. |
size | false | integer | 20 | Number of results per page to return. Maximum size of 100 results. |
Response
Returns a paginated object with the data property containing an array of tokens.
Token data will be returned in search results according to the transform applied within the requesting Application's Access Controls.
Returns an error if tokens could not be retrieved or when an invalid query is submitted.
{
"pagination": {...},
"data": [
{
"id": "c06d0789-0a38-40be-b7cc-c28a718f76f1",
"type": "social_security_number",
"tenant_id": "77cb0024-123e-41a8-8ff8-a3d5a0fa8a08",
"data": "XXX-XX-6789",
"fingerprint": "AKCUXS83DokKo4pDRKSAy4d42t9i8dcP1X2jijwEBCQH",
"containers": ["/pii/high/"],
"metadata": {
"nonSensitiveField": "Non-Sensitive Value"
},
"search_indexes": [
"{{ data }}",
"{{ data | replace: '-' }}",
"{{ data | last4 }}"
],
"fingerprint_expression": "{{ data }}",
"created_by": "fb124bba-f90d-45f0-9a59-5edca27b3b4a",
"created_at": "2021-03-01T08:23:14+00:00"
},
{...},
{...}
]
}
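Because results are paginated, retrieving every match for a query means requesting successive pages. The following sketch uses the JavaScript SDK shown above and keeps requesting pages until one comes back with fewer results than the requested size; that stopping condition is an illustrative assumption rather than a documented pattern.
import { BasisTheory } from "@basis-theory/basis-theory-js";

const bt = await new BasisTheory().init("<PRIVATE_API_KEY>");

// Collect every token matching the query by walking the pages.
const size = 20;
const allTokens = [];
let page = 1;

for (;;) {
  const result = await bt.tokens.search({ query: "data:6789", page, size });
  allTokens.push(...result.data);

  // Assume a short page means the last page of results has been reached.
  if (result.data.length < size) break;
  page += 1;
}

console.log(`Found ${allTokens.length} matching tokens`);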
Query Syntax
Token search supports a Lucene-based query syntax.
A query string is composed of one or more terms that can be combined with the AND operator.
Search terms are formed by joining a field name and the value to search with a colon, in the form field:value.
See the Searchable Token Fields table below for a complete list of supported fields.
Token data may be searched on indexed tokens by performing a case-sensitive, exact match against one of the token's indexed data patterns.
For example, the following query will search for indexed tokens containing the data 123-45-6789:
data:123-45-6789
Phrases or values containing spaces may be searched by wrapping the searched value in quotes, for example:
data:"data containing multiple words"
For more detailed information about supported data searches, see Searching Data.
Metadata search terms require both a key and value to be specified in the form of metadata.key:value.
Metadata will be searched for a case-insensitive, exact match.
For example, to search for tokens having the metadata { customer_id: "cus_12345" }, query for:
metadata.customer_id:cus_12345
Multiple metadata terms may be combined using the AND operator. For example:
metadata.customer_id:my_customer AND metadata.user_id:1234
Only a subset of the full Lucene query syntax is currently supported, and it is limited to the operators and terms documented above. Combining terms for multiple token fields in the same query is not currently supported. If you would like to have support for any additional Lucene features or to query on additional token fields, please let us know.
Searchable Token Fields
Fields | Type | Description | Example |
---|---|---|---|
id | string | Token ID. | id:fe24d4cc-de50-4d8c-8da7-8c7483ba21bf |
data | string | Token data. See Searching Data for supported values. | data:6789 |
metadata.[key] | string | Search against token metadata having the given [key]. | metadata.user_id:34445 |
Searching Data
Basis Theory supports search across sensitive token data by securely creating and storing search indexes at the time of token creation.
To index a token for data search, one or more search_indexes can be specified in the Create Token Request or Tokenize Request.
Some token types will receive a default set of search indexes if custom search_indexes are not specified on the request.
The search_indexes property supports the use of expressions, which are based on the Liquid templating language. Each expression must result in a single value, which cannot be null or empty, otherwise a 400 error will be returned.
Any expressions contained within search_indexes will be evaluated against the token data before generating indexes. Token data searches will only return a token if there is an exact match on one of the evaluated search_indexes.
For example, if a card token has the number 4242424242424242 and it is created with the search index expressions:
{{ data.number }}
{{ data.number | last4 }}
Then you may search for this token by querying on either the full card number:
data:4242424242424242
or by querying on the last 4 digits:
data:4242
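Putting this together, the indexes are supplied when the token is created, and the evaluated values can then be matched by a search. Below is a minimal JavaScript sketch; the camelCase searchIndexes property and the card data fields follow the SDK's usual conventions but are assumptions here, so check the Create Token reference for the exact request shape.
import { BasisTheory } from "@basis-theory/basis-theory-js";

const bt = await new BasisTheory().init("<PRIVATE_API_KEY>");

// Create a card token indexed on the full number and on its last 4 digits
// (property names are assumed to mirror the Create Token request).
const token = await bt.tokens.create({
  type: "card",
  data: {
    number: "4242424242424242",
    expiration_month: 12,
    expiration_year: 2030,
  },
  searchIndexes: ["{{ data.number }}", "{{ data.number | last4 }}"],
});

// Either evaluated index can now be matched exactly.
const byFullNumber = await bt.tokens.search({ query: "data:4242424242424242" });
const byLast4 = await bt.tokens.search({ query: "data:4242" });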