Top 10 Technical requirements for In-Memory Reporting

Discussion in 'Java' started by JTP PR, May 27, 2010.

    With Gartner's Business Intelligence group listing in-memory analytics
    at the top of its recommended Business Intelligence requirements for
    the enterprise, it is no wonder in-memory has become a trending topic
    in BI and a new addition to executive shopping lists.

    We thought we’d share our top 10 technical requirements to ask of an
    in-memory analytics vendor:

    [10] Be open - allow applications other than the in-memory database
    provider's own tools to connect to it

    It’s important to source a non-proprietary database. In-memory
    databases are usually bundled with a visualization tool, so make sure
    the provider also allows other tools to connect to the database; that
    way you can maximize your investment in the technology rather than
    being tied to a proprietary solution.
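
    As a rough illustration of what "open" means in practice, any
    standards-compliant tool should be able to reach the database through
    an interface such as JDBC. Here is a minimal sketch; the driver URL,
    credentials and table are hypothetical placeholders for whatever your
    vendor actually documents:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        public class OpenAccessCheck {
            public static void main(String[] args) throws Exception {
                // Hypothetical JDBC URL - substitute whatever driver and
                // URL your in-memory database vendor documents.
                String url = "jdbc:imdb://analytics-server:5000/sales";

                try (Connection conn =
                         DriverManager.getConnection(url, "report_user", "secret");
                     Statement stmt = conn.createStatement();
                     ResultSet rs = stmt.executeQuery(
                         "SELECT region, SUM(revenue) AS total "
                         + "FROM sales GROUP BY region")) {
                    while (rs.next()) {
                        System.out.println(rs.getString("region")
                                + " = " + rs.getDouble("total"));
                    }
                }
            }
        }

    If a third-party reporting tool can run that same query through the
    same interface, you are not locked into the bundled visualization
    layer.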

    [9] Minimal administration overhead

    In-memory analytic tools often introduce some of the same concerns
    that OLAP stores create: namely, they usually create another data
    source, with its own calculations and business definitions. This is
    where tools such as Yellowfin differ from other in-memory approaches:
    existing queries, reports and dashboards automatically take advantage
    of an in-memory database, seamlessly from the user's point of view.
    Administrators do not add calculations and business logic within
    another layer; these reside within the existing metadata layer for
    reporting that is already built.

    [8] Enterprise scalability

    All BI solutions must include enterprise administrative features, such
    as usage monitoring, single sign-on and change management, and this is
    just as true for in-memory solutions. It is therefore critical that
    you choose enterprise-class infrastructure that enables you to scale
    your deployment as your user base grows.

    [7] Platform independent

    It’s important when selecting an in-memory database that it be
    platform independent: it should run on any hardware platform (PC, Mac,
    Sun SPARC, etc.) and any operating system (Linux, Mac OS, Unix,
    Windows, etc.).

    [6] Data serialized to disk to enable rapid recovery

    An image of the in-memory database is saved to disk, allowing the
    system to quickly reload that image should it need to be restarted.
    Recovery then means reloading the snapshot rather than re-extracting
    the data, so a restart places no additional strain on your production
    or transactional servers.
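
    As a rough sketch of the mechanism (the class and snapshot file name
    are illustrative, not any particular vendor's API), the dataset is
    periodically written to disk as a serialized image and reloaded on
    restart:

        import java.io.File;
        import java.io.FileInputStream;
        import java.io.FileOutputStream;
        import java.io.IOException;
        import java.io.ObjectInputStream;
        import java.io.ObjectOutputStream;
        import java.util.ArrayList;
        import java.util.List;

        public class SnapshotStore {
            private static final File IMAGE = new File("imdb-snapshot.bin");

            // Write the current in-memory dataset to disk as one image.
            static void snapshot(List<Object[]> rows) throws IOException {
                try (ObjectOutputStream out =
                         new ObjectOutputStream(new FileOutputStream(IMAGE))) {
                    out.writeObject(new ArrayList<>(rows)); // Serializable copy
                }
            }

            // After a restart, reload the image instead of re-querying
            // the production or transactional systems.
            @SuppressWarnings("unchecked")
            static List<Object[]> recover()
                    throws IOException, ClassNotFoundException {
                try (ObjectInputStream in =
                         new ObjectInputStream(new FileInputStream(IMAGE))) {
                    return (List<Object[]>) in.readObject();
                }
            }
        }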

    [5] Integration with your existing data warehouse and OLAP cubes

    While some vendors tout in-memory as a way of avoiding building a data
    warehouse, this option usually applies to smaller organizations that
    may only have a single source system. For larger companies that have
    multiple source systems, the data warehouse continues to be the ideal
    place to transform, model and cleanse the data for analysis.

    Look for tools that are designed to integrate with and leverage
    existing BI environments. An in-memory solution that is tightly
    integrated into the visualization tool is critical. However, it is
    equally important that the visualization tool can also access your
    OLAP cubes and data warehouse tables without the need for an in-memory
    middle layer. Without this option, a purely stand-alone in-memory
    solution can lead to yet another version of the truth, adding
    complexity to your BI environment.

    [4] Web-based GUI development and deployment

    Some in-memory tools are not nearly as web-enabled as their
    conventional BI counterparts. This seems to reflect both technology
    immaturity and a tendency toward niche deployments. However, for
    successful adoption with minimal administrative overhead, web-based
    development and deployment is critical. Both the visualization tool
    and the in-memory database need to be server-based deployments so that
    data access security and application upgrades can be easily managed.
    Solutions such as Yellowfin provide a single web-based platform for
    delivering your Business Intelligence needs. From connection through
    to design, modeling and visualization, your users work within a fully
    integrated browser application that encourages collaboration and an
    iterative approach to report development - leading to analytical
    applications that meet the needs of your end users.

    [3] Server side rather than Client side

    Consider a scenario where users conduct complex queries by downloading
    up to 100 million rows of data to their desktops from many data
    sources, or from data feeds on the web. Sure, the information can then
    be sliced and diced into reports, and users can build BI applications
    on their desktops and share them with colleagues. This sounds great in
    theory but is fraught with danger in practice. Not only does it have
    massive potential to create data silos, but data held on a laptop is
    free to leave your premises: at worst it gets lost or stolen, and at
    best it is published without any form of governance.

    [2] Ensure real-time data refresh - incremental loading

    Because reporting data is extracted from a source system or a data
    warehouse and then loaded into memory, data latency can be a concern.
    Front-line workers in a customer service centre, for example, need
    near-real-time, highly granular (detailed) data. If an in-memory tool
    contains last week’s product inventory data, it’s probably not of use
    to customer service reps. Thus, the suitability of an in-memory tool
    and the success of the deployment may hinge on the degree to which the
    solution can automate scheduled incremental data loads.
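
    As a sketch of what such automation can look like (the inventory
    table, the last_modified watermark column and the five-minute schedule
    are all illustrative assumptions, not any particular product's
    behaviour), each run fetches only the rows changed since the previous
    run:

        import java.sql.Connection;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;
        import java.sql.SQLException;
        import java.sql.Timestamp;
        import java.util.concurrent.Executors;
        import java.util.concurrent.ScheduledExecutorService;
        import java.util.concurrent.TimeUnit;

        public class IncrementalLoader {
            // Watermark: newest modification time already loaded into memory.
            private Timestamp watermark = new Timestamp(0L);

            // Fetch only the rows that changed since the last run.
            void loadDelta(Connection source) throws SQLException {
                String sql = "SELECT id, qty, last_modified FROM inventory "
                           + "WHERE last_modified > ?";
                try (PreparedStatement ps = source.prepareStatement(sql)) {
                    ps.setTimestamp(1, watermark);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            // ... upsert row into the in-memory store (omitted)
                            Timestamp ts = rs.getTimestamp("last_modified");
                            if (ts.after(watermark)) {
                                watermark = ts;
                            }
                        }
                    }
                }
            }

            // Run the delta load on a fixed schedule, e.g. every five minutes.
            void start(Connection source) {
                ScheduledExecutorService scheduler =
                        Executors.newSingleThreadScheduledExecutor();
                scheduler.scheduleAtFixedRate(() -> {
                    try {
                        loadDelta(source);
                    } catch (SQLException e) {
                        e.printStackTrace();
                    }
                }, 0, 5, TimeUnit.MINUTES);
            }
        }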

    [1] Data security must be of paramount concern

    In-memory applications have the potential to expose significantly more
    data to end users than ever before. This raises security issues
    regarding how data is accessed, where it is stored and who has access
    to that data.

    In determining the best strategy for your in-memory deployment,
    security needs to be foremost in your selection criteria. There are
    two aspects to security. Firstly, where is your data stored, and is
    that storage secure? Secondly, who has access to that data store?
    Transactional applications go to great lengths to embed data security
    and access rules - ensure your in-memory database inherits these and
    is not simply a data-access free-for-all.
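
    One way to picture that inheritance is a per-user row filter applied
    on the server before any result leaves it. The sketch below is a
    simplified assumption of how such rules might be represented; in a
    real deployment the predicates would be derived from the source
    application's security model rather than hand-built:

        import java.util.List;
        import java.util.Map;
        import java.util.function.Predicate;
        import java.util.stream.Collectors;

        public class RowLevelSecurity {
            // One predicate per user, mirroring the access rules of the
            // source system. In practice these would be loaded from the
            // transactional application's security model, not hard-coded.
            private final Map<String, Predicate<Map<String, Object>>> rulesByUser;

            public RowLevelSecurity(
                    Map<String, Predicate<Map<String, Object>>> rulesByUser) {
                this.rulesByUser = rulesByUser;
            }

            // Apply the caller's rule before any row leaves the server;
            // unknown users see nothing (default deny).
            public List<Map<String, Object>> filter(
                    String user, List<Map<String, Object>> rows) {
                Predicate<Map<String, Object>> rule =
                        rulesByUser.getOrDefault(user, row -> false);
                return rows.stream().filter(rule).collect(Collectors.toList());
            }
        }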

    Further Information:

    Download publications (click a link or paste it into your browser; no
    signup or email required):

    In-Memory Brochure
    http://yellowfin.com.au/Document.i4?DocumentId=104877

    In-Memory Whitepaper
    http://yellowfin.com.au/Document.i4?DocumentId=104879
     
