Searching All Tables in a SQL Server Database for a Value

One of my IT guys was trying to write a SQL query to search for a keyword in every table and column of a particular database. There are tons of solutions out there on how to do this, and after looking at what he had done so far, I decided to see how quickly I could figure one out. It ended up taking about 15 minutes.

CAUTION: This is probably not the fastest or most optimized solution, but it's fairly simple:

USE My_Database;

-- Store results in a local temp table so that
-- I can access them inside sp_executesql.
CREATE TABLE #tmp (
tbl nvarchar(max),
col nvarchar(max),
val nvarchar(max)
);

DECLARE @tbl nvarchar(max);
DECLARE @col nvarchar(max);
DECLARE @q nvarchar(max);
DECLARE @search nvarchar(max) = 'my search key';

-- Create a cursor over every table/column pair in the database
DECLARE c CURSOR FOR
SELECT t.name, c.name
FROM sys.tables t
JOIN sys.columns c ON c.object_id = t.object_id;

OPEN c;
FETCH NEXT FROM c INTO @tbl, @col;

-- For each table and column pair, see if the search value exists.
WHILE @@FETCH_STATUS = 0
BEGIN
-- Look for the search key in the current table column and, if found, add it to the results.
SET @q = 'INSERT INTO #tmp SELECT ''' + @tbl + ''', ''' + @col + ''', ' + @col + ' FROM ' + @tbl + ' WHERE ' + @col + ' LIKE ''%' + @search + '%''';
EXEC sp_executesql @q;
FETCH NEXT FROM c INTO @tbl, @col;
END

CLOSE c;
DEALLOCATE c;

-- Get results
SELECT tbl, col, val FROM #tmp;

-- Remove local temp table.
DROP TABLE #tmp;
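The same "search every table and column" idea can be sketched in a portable form with Python's sqlite3 (the tables and rows here are made up for the demo; sqlite's catalog queries stand in for sys.tables/sys.columns):

```python
import sqlite3

def search_everywhere(con, needle):
    """Search every column of every table for `needle`, collecting (table, column, value) hits."""
    hits = []
    tables = [r[0] for r in con.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    for tbl in tables:
        cols = [r[1] for r in con.execute(f"PRAGMA table_info({tbl})")]
        for col in cols:
            # Same dynamic-SQL trick as the T-SQL version: build the query per column
            q = f"SELECT '{tbl}', '{col}', {col} FROM {tbl} WHERE {col} LIKE ?"
            hits += con.execute(q, (f"%{needle}%",)).fetchall()
    return hits

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE users (name TEXT, city TEXT)")
con.execute("INSERT INTO users VALUES ('Ada', 'London')")
con.execute("CREATE TABLE notes (body TEXT)")
con.execute("INSERT INTO notes VALUES ('met Ada today')")
print(search_everywhere(con, "Ada"))
# [('users', 'name', 'Ada'), ('notes', 'body', 'met Ada today')]
```
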

Attaching Taxonomy Terms to Existing Drupal Nodes with SQL

A website I work with has a whole bunch of Taxonomy “keywords” that needed to be assigned to existing content. There's no way I could use third-party modules to assign these terms to the existing content because I'm working with large data sets (~500,000). Plus, why have the overhead when I can just use SQL to access and manipulate the data directly?

Below is a SQL statement that will add a taxonomy association to any content that contains the taxonomy keyword:

INSERT INTO term_node
(nid, vid, tid)
SELECT q.nid, q.nid, t.tid
FROM content_type_yourtype q
JOIN term_data t
-- Make sure you only find full words, not partial matches
ON q.field_type_value LIKE CONCAT('% ', t.name, ' %')
-- Don't try to add existing entries
WHERE NOT EXISTS (SELECT n.nid FROM term_node n WHERE n.nid = q.nid AND n.vid = q.nid AND n.tid = t.tid)
-- Clear out blanks
AND t.name != ''
-- This is optional; I need it because my hosting provider times out on 3+ minute queries...
AND q.nid >= 200000 AND q.nid < 300000;
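To see the whole-word matching in action, here's a small sketch using Python's sqlite3 (sqlite uses || instead of CONCAT, and the sample rows are invented). Note the body text is padded with spaces so words at the very start or end still match; without padding, the '% term %' pattern misses them:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE content (nid INTEGER, body TEXT)")
cur.execute("CREATE TABLE term_data (tid INTEGER, name TEXT)")
cur.executemany("INSERT INTO content VALUES (?, ?)",
                [(1, " a post about drupal modules "),
                 (2, " drupalcon was great ")])
cur.execute("INSERT INTO term_data VALUES (10, 'drupal')")

# '% drupal %' requires a space on both sides, so the partial
# word 'drupalcon' is NOT matched -- only nid 1 qualifies.
hits = cur.execute("""
    SELECT c.nid, t.tid FROM content c
    JOIN term_data t ON c.body LIKE '% ' || t.name || ' %'
""").fetchall()
print(hits)  # [(1, 10)]
```
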

SCRATCH THIS. You need to set up a job to run this; doing it in one query is too much work on the database:

PHP Script to use:

<?php
// tax_update.php - runs the taxonomy insert above for a bounded nid range.
// (Reconstructed from a fragment; assumes Drupal is bootstrapped so db_query() exists.)
$min = (int) $argv[1];
$max = (int) $argv[2];
db_query("INSERT INTO term_node (nid, vid, tid)
  SELECT q.nid, q.nid, t.tid
  FROM content_type_yourtype q
  JOIN term_data t ON q.field_type_value LIKE CONCAT('% ', t.name, ' %')
  WHERE NOT EXISTS (SELECT n.nid FROM term_node n
    WHERE n.nid = q.nid AND n.vid = q.nid AND n.tid = t.tid)
  AND t.name != ''
  AND q.nid >= $min AND q.nid < $max ");

Bash Script to automate:

min=200000   # start of the nid range (see the query above)
limit=1000   # rows per job
for i in {1..150};
do echo "Running Job $i starting at nid $min";
php -e tax_update.php $min `expr $min + $limit`;
min=`expr $min + $limit`;
done
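The range-splitting that the bash loop performs can be sketched generically; this is just the arithmetic, using the nid bounds from the query above:

```python
def batches(lo, hi, size):
    """Split [lo, hi) into consecutive [start, end) ranges of at most `size` ids each."""
    start = lo
    while start < hi:
        end = min(start + size, hi)
        yield start, end
        start = end

# The nid range from the post, in 1,000-id chunks
jobs = list(batches(200000, 300000, 1000))
print(len(jobs))   # 100
print(jobs[0])     # (200000, 201000)
print(jobs[-1])    # (299000, 300000)
```
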

Manually Fixing / Adding Permissions to node_access Table (Drupal 6)

For some reason, one of the Drupal sites I work on for a client wasn't adding new rows to the node_access table for newly created content. We noticed this because some results weren't showing up in searches. (They won't if the node doesn't have grant_view checked.) Anyhow, I created the following queries to fix this:

/*
 * This statement is just used to check how many need to be
 * sync'd and can be ignored. It does let you know how many
 * nodes you're missing, though 😉
 */
SELECT count(*) FROM mydatabase.node n where n.nid not in (
select a.nid from mydatabase.node_access a)
and (n.type = 'content_type_a' or n.type = 'content_type_b');

/*
 * Create a temp table to hold all of the missing permissions.
 * Note, this isn't required if you only have one access type,
 * but since I have three, it is necessary. This table's structure
 * is just a replica of the node_access table.
 */
create temporary table s (
nid int not null,
gid int,
realm varchar(255),
grant_view tinyint(3),
grant_update tinyint(3),
grant_delete tinyint(3)
);

/*
 * I have three node permission groups that I have to change,
 * so I will run three insert statements to populate my temp
 * table. There is probably a way to combine this into one
 * SQL query, but that's another day's lesson.
 */
insert into s
/* This group gets view access */
select n.nid, GID_VAL, 'term_access', 1, 0, 0
from mydatabase.node n
where n.nid not in (
select a.nid from mydatabase.node_access a)
and (n.type = 'content_type_a' or n.type = 'content_type_b');

insert into s
/* This group gets view access */
select n.nid, GID_VAL, 'term_access', 1, 0, 0
from mydatabase.node n
where n.nid not in (
select a.nid from mydatabase.node_access a)
and (n.type = 'content_type_a' or n.type = 'content_type_b');

insert into s
/* This group gets view, update and delete access */
select n.nid, GID_VAL, 'term_access', 1, 1, 1
from mydatabase.node n
where n.nid not in (
select a.nid from mydatabase.node_access a)
and (n.type = 'content_type_a' or n.type = 'content_type_b');

/*
 * Finally, add all of the missing node permissions into
 * the real node_access table!
 */
insert into mydatabase.node_access
(nid, gid, realm, grant_view, grant_update, grant_delete)
select s.nid, s.gid, s.realm, s.grant_view, s.grant_update, s.grant_delete
from s;

/*
 * You might wonder why you need the temp table: it's because the
 * WHERE clause would return no rows after that first INSERT.
 * If you only have one GID to worry about, you can forget
 * the temp table step.
 */

/* Confirm you get 0 results. (No missing nodes.) */
SELECT count(*) FROM mydatabase.node n where n.nid not in (
select a.nid from mydatabase.node_access a)
and (n.type = 'content_type_a' or n.type = 'content_type_b');

/* Remove the temp table. */
drop table s;
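To convince yourself why the temp table matters, here's a toy version in Python's sqlite3 (made-up, stripped-down schema): once the first group's rows land in node_access, the "missing nodes" query comes back empty, so a second direct insert would add nothing.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE node (nid INTEGER)")
cur.execute("CREATE TABLE node_access (nid INTEGER, gid INTEGER, grant_view INTEGER)")
cur.executemany("INSERT INTO node VALUES (?)", [(1,), (2,)])

# The "which nodes are missing permissions?" query
missing = "SELECT nid FROM node WHERE nid NOT IN (SELECT nid FROM node_access)"

# Insert the first group (gid 100) straight into node_access...
cur.execute(f"INSERT INTO node_access SELECT nid, 100, 1 FROM ({missing})")

# ...and now the same 'missing' query is empty, so a direct insert
# for a second group (gid 200) would get zero rows:
print(cur.execute(missing).fetchall())  # []
```

Staging all three groups' rows in a temp table first, then copying them into node_access in one shot, avoids this.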

Yes, you could just rebuild all of the permissions through Drupal, but when the site has an extremely large node population, that takes hours.

Adding Column Messes Up Existing SQL View

So I needed to add a new form field to an existing application, which usually means adding a new column to a SQL table. Cool, no biggie. Everything was fine and dandy until I went to SharePoint to view the external lists connected to the database and kept getting errors about string-to-datetime conversion. WTF…

These external lists were reading from SQL views, so I took a look at the content coming from the views, and about halfway through the columns everything was shifted right by one. The shift started exactly where I had inserted the new column (I didn't append it to the end of the table). The issue is that when I created the view, I used a select * instead of explicitly stating the columns. When you do this, SQL Server expands * into the columns that exist at creation time, turning it into “select a, b, c, d from table”, and saves THAT as the view query. So when the select from the view was being called from SharePoint, it was still referencing the old column positions, which had since changed. (The new column is NULL for all existing rows.)

This should help you to visualize:

1) Created view with select *
2) SQL Server expanded to select a, b, c, d and saved as view.
3) Now by selecting all from view, it’s using: select a, b, c, d from myView

a | b | c | d
1 | 2 | 3 | 4

1) Added column newCol after column b
2) The view is still being called with: select a, b, c, d from myView, but now newCol's values come back in c's position, and old c's values shift into d's:

a | b | c | d
1 | 2 | NULL | 3

AH, d is showing c's column value!! So the rest of the columns are now off by one.
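A toy model of what the stale view is doing: it keeps reading the original four ordinal positions, even though a new column now sits in position three (column names and values taken from the example above):

```python
# Columns at view-creation time; the view saved these four ordinal bindings
old_columns = ["a", "b", "c", "d"]
row = {"a": 1, "b": 2, "c": 3, "d": 4}

# Insert newCol after b; existing rows have NULL (None) for it
new_columns = ["a", "b", "newCol", "c", "d"]
row["newCol"] = None

# The stale view still reads the first four ordinal positions,
# so c's slot now yields newCol and d's slot yields old c:
stale_view = [row[new_columns[i]] for i in range(len(old_columns))]
print(stale_view)  # [1, 2, None, 3]
```
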

To fix it, I just ran an ALTER VIEW to re-create the view (sp_refreshview would also rebind the columns).

But lesson learned: DON'T create views with SELECT *.

Updating All SQL Table Rows with a Fallback Default Value

I'm working on merging two systems together, and that involves mapping some of the database content in system B to system A. Basically, I have a table from system B with an identifier column that will not map to system A, so I've added a new column (new_system_a_id) to that table:

System B Table
nvarchar | old_identifier
int | new_system_a_id

Now I needed a way to map all of system A id’s to System B Table. I created a mapping table with the following:

Mapping Table
nvarchar | name
int | system_a_id

From here I just ran

update system_b_table
set new_system_a_id = (select system_a_id from mapping_table where name = old_identifier)

This worked fine, except that every row in system B's table that did not have a mapping was set to NULL. To change it to a default value, I just used the COALESCE function:

/* Update system_b_table with the new mapping. If there is no mapping, default to 555. */
update system_b_table
set new_system_a_id = coalesce(
(select system_a_id from mapping_table where name = old_identifier),
555);
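Here's the same fallback pattern in miniature, using Python's sqlite3 with the table names from above (the sample rows are made up):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE system_b_table (old_identifier TEXT, new_system_a_id INTEGER)")
cur.execute("CREATE TABLE mapping_table (name TEXT, system_a_id INTEGER)")
cur.executemany("INSERT INTO system_b_table (old_identifier) VALUES (?)",
                [("alpha",), ("orphan",)])  # 'orphan' has no mapping
cur.execute("INSERT INTO mapping_table VALUES ('alpha', 42)")

# COALESCE returns the first non-NULL argument, so unmapped rows get 555
cur.execute("""
    UPDATE system_b_table
    SET new_system_a_id = COALESCE(
        (SELECT system_a_id FROM mapping_table WHERE name = old_identifier),
        555)
""")
rows = cur.execute("SELECT old_identifier, new_system_a_id FROM system_b_table").fetchall()
print(rows)  # [('alpha', 42), ('orphan', 555)]
```
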

Parsing Full Names into First and Last Columns in MySQL

So the other day I was importing a list of authors into one of my client's databases, and I screwed up and put the entire name into the first name column. (Well, really, because that's how the file I was parsing was created.) Anyway, my client wanted them separated, which made sense since the other 8,000 authors were done this way. To do this I wrote the following query and ran it:

update db.content_type_auth a
# Update the last name first so you don't lose the entire name. You're just taking everything after the first word and setting that as the last name. (+2 because of the ' ' separator)
set a.lname = substring(a.fname, length(substring_index(a.fname, ' ', 1))+2, length(a.fname)),
# Update the first name using the first word
a.fname = substring_index(a.fname, ' ', 1)

# This was needed for my specific case.
where a.nid in (
SELECT nid FROM db.node where type = 'auth' and nid > 555555 and nid < 666666 );

The only side effect of the above query was that if the name had any leading white space, the first name became blank and the last name became the whole name. To fix that, just add trim() around the fname references:

# Last name
set a.lname = substring(trim(a.fname), length(substring_index(trim(a.fname), ' ', 1))+2, length(a.fname)),

# First name
a.fname = substring_index(trim(a.fname), ' ', 1)
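The string logic is easier to check outside of MySQL; here's a quick Python equivalent of the trimmed version (the sample names are made up):

```python
def split_name(full):
    """Mimic the SQL: first word becomes fname, everything after it becomes lname."""
    full = full.strip()            # trim() guards against leading/trailing spaces
    first = full.split(' ', 1)[0]  # substring_index(fname, ' ', 1)
    last = full[len(first) + 1:]   # skip the first word plus the space (the +2 in 1-based SQL)
    return first, last

print(split_name("Jane van der Berg"))  # ('Jane', 'van der Berg')
print(split_name("  Leading Space"))    # ('Leading', 'Space')
```
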

I also had about 5 that were co-presenters, but that's an entirely different situation.

Allow Nulls Column Not Allowing Modification of SharePoint 2010 ECT

I had created an external content type from an events table in my database. Two of the columns, end_day and start_day, were datetime value types and were set to NOT NULL. I had no problems creating the ECT like this. However, when I later added some columns to the events table and needed them to show up in the “create” operation on the ECT, I couldn’t save the changes. I got the following error:

Data source element end_day is of data type system.datetime, the mapped item end_day is of data type system.nullable
Data source element start_day is of data type system.datetime, the mapped item start_day is of data type system.nullable

To fix this, I just set the columns in the events table to allow nulls (ALTER TABLE events ALTER COLUMN start_day datetime NULL, and the same for end_day).

I don't understand why I was able to create the initial ECT with columns that didn't allow nulls, but wasn't able to modify it later. That's a question for another day.

Case Project: Displaying Login Activity in a SharePoint External Content Type Chart

Graphs/charts are fancy. They're a great way to display statistics from your external content to your users. People love to keep track of things like sex, ethnicity, etc., and they'd like a pleasant way of seeing the results. (e.g., show me the trends of white males compared to black males who have applied for this program since its inception.) SharePoint to the rescue!

Ok, so what I want to do is show a bar graph of daily logins to an application. Each time a user logs in to the external application, a row is inserted into a database table with some information about the login, one item being the login time. SharePoint has a built-in web part called “charts” that lets you easily display pie charts, line graphs, bar graphs, etc. with any of your external content data.

What I need to do is convert a table like so

into charts like the following: (daily, monthly and yearly counts)

Really, the only “hard” part is creating an external content type from the login table that the charts will be able to read. I created three views, one for each chart:

Daily Counts:

-- Convert the datetime value into mm/dd/yyyy format
-- sum(1) - for each match in the group, add one
-- COUNT(id) - I have to have some sort of primary key for SharePoint; otherwise I can't create the external content type
select CONVERT(VARCHAR(10), submitted, 101) as [Submitted], sum(1) as [Count], COUNT(id) as [id]
from login_tracker
-- Only get the last 7 days
where submitted >= DATEADD(day, DATEDIFF(day, 6, GETDATE()), 0)
-- Group together each row that has the same mm/dd/yyyy
group by CONVERT(VARCHAR(10), submitted, 101)

Monthly Counts:

-- Now we're converting the datetime stamp to 1 - 12.
select month(CONVERT(VARCHAR(10), submitted, 101)) as [monthnum], DATENAME(month, CONVERT(VARCHAR(10), submitted, 101)) as [Submitted], sum(1) as [Count], COUNT(id) as [id]
from login_tracker
-- Only get data from the current year
where year(submitted) = year(GETDATE())
group by DATENAME(month, CONVERT(VARCHAR(10), submitted, 101)), month(CONVERT(VARCHAR(10), submitted, 101))

Yearly Logins:

-- Convert datetime to yyyy format (e.g. 2012)
select DATENAME(year, CONVERT(VARCHAR(10), submitted, 101)) as [Submitted], sum(1) as [Count], COUNT(id) as [id]
from login_tracker
group by DATENAME(year, CONVERT(VARCHAR(10), submitted, 101))
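The grouping behind all three views is the same trick; here's a compact sketch with Python's sqlite3 (strftime stands in for T-SQL's CONVERT/DATENAME, and the login rows are invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE login_tracker (id INTEGER PRIMARY KEY, submitted TEXT)")
cur.executemany("INSERT INTO login_tracker (submitted) VALUES (?)",
                [("2012-03-01 09:00",), ("2012-03-01 17:30",),
                 ("2012-03-02 08:15",), ("2012-04-10 12:00",)])

# Daily counts: truncate the timestamp to its date and group on that
daily = cur.execute("""
    SELECT strftime('%Y-%m-%d', submitted) AS day, COUNT(id) AS n
    FROM login_tracker GROUP BY day ORDER BY day
""").fetchall()
print(daily)  # [('2012-03-01', 2), ('2012-03-02', 1), ('2012-04-10', 1)]

# Monthly counts, restricted to one year
monthly = cur.execute("""
    SELECT strftime('%m', submitted) AS month, COUNT(id) AS n
    FROM login_tracker
    WHERE strftime('%Y', submitted) = '2012'
    GROUP BY month ORDER BY month
""").fetchall()
print(monthly)  # [('03', 3), ('04', 1)]
```
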

From here you just make an ECT from each view and connect it to a chart!