If you’ve ever pointed a search tool at a giant dataset on a NAS and watched it spray indexes into your user profile… this one’s for you. Here’s a clean, repeatable way to map a Synology share, tell qgrep to keep its index in a specific folder on that share, and then search it fast.
Use case: I’ve got breach data under `breach_data\china` on a Synology mapped to `Z:`. I want the index next to the data, in `Z:\qgrep_china_index`, not under `C:\Users\<me>\.qgrep`.
TL;DR (commands you can paste)
```bat
REM 1) One-time init: config/index lives in Z:\qgrep_china_index
qgrep init "Z:\qgrep_china_index\china.cfg" "Z:\breach_data\china"

REM 2) Build/update the index
qgrep update "Z:\qgrep_china_index\china.cfg"

REM 3) Search it (options are letters, not dashes)
qgrep search "Z:\qgrep_china_index\china.cfg" i "password"
qgrep search "Z:\qgrep_china_index\china.cfg" il "example@domain.com"
qgrep search "Z:\qgrep_china_index\china.cfg" Vi "credential leak"
```
- `i` → case-insensitive
- `l` → literal substring (no regex)
- `V` → Visual Studio-style formatting (nice in many terminals/editors)
1) Map the Synology share (Windows)
- In File Explorer → right‑click This PC → Map network drive…
- Pick a letter (I’ll use `Z:`).
- Folder: `\\10.0.0.130\<YourSharedFolder>` (for example, `\\10.0.0.130\data`).
- Check Reconnect at sign‑in and authenticate with your Synology account.
If you’re on macOS: Finder → Go → Connect to Server… → `smb://10.0.0.130/<YourSharedFolder>` → Connect. The rest of this post still applies; just swap `Z:` paths for your mount point.
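If you’d rather script the mapping than click through File Explorer, Windows’ built-in `net use` does the same job (the share name `data` is just the example from above; swap in yours):

```bat
REM Map Z: to the Synology share and reconnect at sign-in
net use Z: \\10.0.0.130\data /persistent:yes

REM Or, if the share needs explicit credentials (you'll be prompted for the password):
net use Z: \\10.0.0.130\data /user:YourSynologyUser /persistent:yes
```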
2) Why put the index next to the data?
- Portability: Move/copy/share the dataset + index as a unit.
- Clean home folder: No sprawling `.qgrep` under your profile.
- Multi‑machine: Any analyst who mounts the share can reuse the index.
Create the index folder once:
```bat
mkdir "Z:\qgrep_china_index"
```
3) Initialize a qgrep project with a custom location
`qgrep` expects either a project name (stored under your home directory) or a path to a `.cfg` file. We’ll use the second form so the config and index live on `Z:`:
```bat
qgrep init "Z:\qgrep_china_index\china.cfg" "Z:\breach_data\china"
```
What this does:
- Writes `china.cfg` under `Z:\qgrep_china_index`
- Associates that config with the source tree `Z:\breach_data\china`
- Subsequent `update`/`search` calls use that `.cfg` path
Tip: If you accidentally ran `qgrep init china "..."` earlier, that made a project under your home directory. No problem; just use the `.cfg` form above going forward.
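For the curious: the generated config is just a small text file. On my setup it boils down to something like the sketch below; treat the exact directive names (`path`, `include`) as an assumption, and check your own generated file rather than hand-writing one (qgrep’s defaults lean toward source-code extensions, so for breach data you’d list the text formats you actually have):

```text
path Z:/breach_data/china
include \.(txt|csv|log|sql|json)$
```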
4) Build (and refresh) the index
```bat
qgrep update "Z:\qgrep_china_index\china.cfg"
```
- First run: full index (time depends on dataset + network).
- Later runs: incremental and fast. Make it a habit to run an update after you add new files.
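If the dataset changes on a schedule, you can let Windows Task Scheduler run the refresh for you. The task name and time below are arbitrary choices, not anything qgrep requires:

```bat
REM Refresh the index nightly at 02:00 (run from an elevated prompt if needed)
schtasks /create /tn "qgrep china update" /sc daily /st 02:00 /tr "qgrep update Z:\qgrep_china_index\china.cfg"
```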
5) Search patterns that actually help
A few real‑world examples you’ll likely use on breach data:
```bat
REM Case-insensitive keyword
qgrep search "Z:\qgrep_china_index\china.cfg" i "password"

REM Literal email (avoid regex surprises)
qgrep search "Z:\qgrep_china_index\china.cfg" il "example@domain.com"

REM Multiple tokens: use regex OR literal wisely
qgrep search "Z:\qgrep_china_index\china.cfg" i "(apikey|token|bearer)"

REM CSV-ish hunting: literal comma-separated field labels
qgrep search "Z:\qgrep_china_index\china.cfg" il "username,password"

REM Nice formatting for editor jumps
qgrep search "Z:\qgrep_china_index\china.cfg" Vi "credential leak"
```
Heads‑up: The search options are letters without dashes (e.g., `i`, `l`, `V`). If you type `-i` out of habit, `qgrep` will ignore it.
6) One‑liner batch to set up and test
Save this as `qgrep_china_setup_and_search.bat` and run it from any shell:
```bat
@echo off
set CFG=Z:\qgrep_china_index\china.cfg
set DATA=Z:\breach_data\china
set TEST=il "password"

if not exist "Z:\qgrep_china_index" mkdir "Z:\qgrep_china_index"

REM Initialize only if it doesn't already exist
if not exist "%CFG%" (
    echo Initializing project at %CFG% for %DATA%...
    qgrep init "%CFG%" "%DATA%"
)

echo Updating index...
qgrep update "%CFG%"

echo.
echo Running test search (%TEST%)...
qgrep search "%CFG%" %TEST%

echo.
echo Done.
```
PowerShell flavor:
```powershell
$cfg = 'Z:\qgrep_china_index\china.cfg'
$data = 'Z:\breach_data\china'
if (-not (Test-Path 'Z:\qgrep_china_index')) { New-Item -ItemType Directory 'Z:\qgrep_china_index' | Out-Null }
if (-not (Test-Path $cfg)) { qgrep init $cfg $data }
qgrep update $cfg
qgrep search $cfg il 'password'
```
7) Performance + sanity notes
- Index once, search many times: Most of your cost is the initial pass.
- Network throughput matters: Indexing a big share over Wi‑Fi will feel slow. If practical, do the initial build over a wired connection.
- Permissions: If you see access errors, make sure the Synology share account can read all subfolders (and that nothing is locked by another process).
- Scope control: If you’ve got noisy artifacts (e.g., `node_modules`, archives), consider excluding them upstream (for example, remove them from the dataset copy) to keep the index lean.
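If pruning the dataset copy isn’t practical, the `.cfg` itself can carry regex-based `include`/`exclude` filters. The directive names below are my recollection of qgrep’s config format, so verify against a freshly generated file before relying on them:

```text
path Z:/breach_data/china
exclude /node_modules/
exclude \.(zip|7z|rar|gz)$
```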
8) Optional: interactive mode and watching changes
When you’re iterating a lot:
```bat
qgrep interactive "Z:\qgrep_china_index\china.cfg"
```
This gives you a prompt where you can type queries rapidly without re‑specifying options. For incremental updates, just run `qgrep update` again whenever the dataset changes.
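If you want the index kept fresh while you work, a crude polling loop covers the gap (the one-hour interval is an arbitrary choice):

```bat
@echo off
REM Re-run the incremental update every hour until you Ctrl+C out
:loop
qgrep update "Z:\qgrep_china_index\china.cfg"
timeout /t 3600 /nobreak >nul
goto loop
```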
Wrap‑up
That’s the workflow:
- Map the Synology share.
- Create a dedicated index folder next to the data.
- `qgrep init` using a `.cfg` path on the share.
- `qgrep update` to build the index.
- `qgrep search` with letter options (`i`, `l`, `V`) to hunt quickly.