  1. #1

    Unanswered: query to retrieve the columns that are null in a table

    Hi,

    I need help building a query that shows me how many columns, within a range of columns, are null.
    Example: quantity1;quantity2;quantity3;quantity4;quantity5; quantity6;quantity7;

    Which columns are null?

    Thanks in advance

  2. #2
    Hi Teixeira,
    I'm not sure what you are asking. If you could supply a table creation script, some test data, and what the "result" should be based on the test data, that would help enormously.

    Thanks,
    Cat

  3. #3
    USE [myDB]
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO
    CREATE TABLE [dbo].[books](
    [book_id] [int] IDENTITY(1,1) NOT NULL,
    [book_description] [nvarchar](max) COLLATE Latin1_General_CI_AS NULL,
    [quantity1] [decimal](18, 2) NOT NULL,
    [quantity2] [decimal](18, 2) NULL,
    [quantity3] [decimal](18, 2) NULL,
    [quantity4] [decimal](18, 2) NULL,
    [quantity5] [decimal](18, 2) NULL,
    [quantity6] [decimal](18, 2) NULL,
    [quantity7] [decimal](18, 2) NULL,
    [quantity8] [decimal](18, 2) NULL,
    [quantity9] [decimal](18, 2) NULL,
    [quantity10] [decimal](18, 2) NULL)

    This is my structure, adapted.
    Based on this, I want to know which columns are not NULL, so that my query result does not display, for example, 10 quantity columns when only 3 of them actually hold quantities.

  4. #4
    Do you expect your query to return a single rowset, or is it possible to return multiple rows?

    -- This is all just a Figment of my Imagination --

  5. #5
    Yes!
    It can return several rows,
    but it's not necessary to return columns that have null or empty values, because that would generate a lot of unnecessary columns in my datagrid display object.

  6. #6
    I would change your structure from this:

    Code:
    CREATE TABLE [dbo].[books](
    [book_id] [int] IDENTITY(1,1) NOT NULL,
    [book_description] [nvarchar](max) COLLATE Latin1_General_CI_AS NULL,
    [quantity1] [decimal](18, 2) NOT NULL,
    [quantity2] [decimal](18, 2) NULL,
    [quantity3] [decimal](18, 2) NULL,
    [quantity4] [decimal](18, 2) NULL,
    [quantity5] [decimal](18, 2) NULL,
    [quantity6] [decimal](18, 2) NULL,
    [quantity7] [decimal](18, 2) NULL,
    [quantity8] [decimal](18, 2) NULL,
    [quantity9] [decimal](18, 2) NULL,
    [quantity10] [decimal](18, 2) NULL)
    to this:

    Code:
    CREATE TABLE [dbo].[books](
    [book_id] [int] IDENTITY(1,1) NOT NULL,
    [book_description] [nvarchar](max) COLLATE Latin1_General_CI_AS NULL)
    GO
    CREATE TABLE [dbo].[bookquantity](
    [book_id] [int] NOT NULL,
    [quantity] [decimal](18, 2) NOT NULL)
    GO
    ALTER TABLE [dbo].[bookquantity]
    ADD CONSTRAINT FK_book FOREIGN KEY (book_id) REFERENCES [dbo].[books] (book_id)
    GO
    This way you are not tied to only 10 quantities, and you don't even need to store the NULL values.
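
    A quick usage sketch of the reworked tables (the book description and quantity values here are made up purely for illustration):

    Code:
    -- Insert one book, then its quantities as separate rows (no NULL padding needed).
    INSERT INTO dbo.books (book_description) VALUES (N'Example book');

    DECLARE @id int;
    SET @id = SCOPE_IDENTITY();

    INSERT INTO dbo.bookquantity (book_id, quantity) VALUES (@id, 12.70);
    INSERT INTO dbo.bookquantity (book_id, quantity) VALUES (@id, 33.45);
    INSERT INTO dbo.bookquantity (book_id, quantity) VALUES (@id, 9.00);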

    If you can't change the structure of the table, I would suggest either creating a temp table with the above structure and populating it with the data from the master table so that you can weed out the nulls, or creating a single delimited string which the application can parse through. SQL can't really handle returning a result set with a variable number of fields.

    The first suggestion would yield a result set like:
    book_id  quantity
    -----------------
    1        12.70
    1        33.45
    1        9.00
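
    One way to produce that result from the existing wide table (not spelled out above) is an UNPIVOT query, which skips NULL values automatically. This is only a sketch, assuming the original [dbo].[books] structure, SQL Server 2005 or later, and a temp table name (#bookquantity) chosen for illustration:

    Code:
    -- Keep only the non-NULL quantities, one row per book_id/quantity pair.
    -- UNPIVOT silently drops rows where the unpivoted value is NULL.
    SELECT book_id, quantity
    INTO #bookquantity
    FROM dbo.books
    UNPIVOT (quantity FOR slot IN
        (quantity1, quantity2, quantity3, quantity4, quantity5,
         quantity6, quantity7, quantity8, quantity9, quantity10)) AS u;

    SELECT book_id, quantity FROM #bookquantity ORDER BY book_id;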

    The second suggestion would yield a result set like:
    book_id  quantity_list
    ------------------------
    1        12.70|33.45|9.00
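
    For the delimited string, one common SQL Server 2005 approach is FOR XML PATH; again just a sketch, assuming the #bookquantity rows from the previous example and the '|' separator shown above:

    Code:
    -- Build one pipe-delimited quantity list per book; STUFF removes the leading '|'.
    SELECT b.book_id,
           STUFF((SELECT '|' + CONVERT(varchar(30), q.quantity)
                  FROM #bookquantity q
                  WHERE q.book_id = b.book_id
                  FOR XML PATH('')), 1, 1, '') AS quantity_list
    FROM dbo.books b;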

    Hope this helps.
    Cat

  7. #7
    I think both of your ideas are good solutions.
    As I already have some data in the tables, normalizing it further as you suggested would take me more time, but the second idea solves the problem perfectly.

    Thanks for the help.

    Teixeira
