MVC Bulk Edit - Linq to Sql List Save


To get my head round some fundamentals of MVC and LINQ to SQL, I'm working on an adaptation of Stephen Walther's TaskList application:

I'm adding a Bulk Edit system using the concepts described in Steve Sanderson's blog.

This all works as expected; however, I'm having trouble saving my returned List of Tasks. The Post to my BulkEdit action loops through the returned list and updates each matching item in my LINQ to SQL db.Tasks table.

My BulkEdit view inherits ViewPage<List<TaskList.Models.Task>> and is as follows:

<% using (Html.BeginForm()) { %>
        <div id="items">

            <% foreach (var task in ViewData.Model) { %>
                <% Html.RenderPartial("TaskEditor", task,
                       new ViewDataDictionary(ViewData) { { "prefix", "tasks" } }); %>
            <% } %>

        </div>
        <input type="submit" value="Save changes" />
<% } %>


The TaskEditor control inherits System.Web.Mvc.ViewUserControl<Models.Task> and looks like this:

<%= Html.Hidden(ViewData["prefix"] + ".index", ViewData.Model.Id) %>

<% var fieldPrefix = string.Format("{0}[{1}].", ViewData["prefix"], ViewData.Model.Id); %>

<%= Html.Hidden(fieldPrefix + "Id", ViewData.Model.Id) %>
<%= Html.TextBox(fieldPrefix + "TaskDescription", ViewData.Model.TaskDescription)%>
<%= Html.TextBox(fieldPrefix + "EntryDate", ViewData.Model.EntryDate.ToString("o"))%>   
<%= Html.CheckBox(fieldPrefix + "IsCompleted", ViewData.Model.IsCompleted)%>
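For reference, here's a quick console sketch (not part of the app, just an illustration using a hypothetical task with Id 7) of the field names the control above generates, following Sanderson's convention:

```csharp
using System;

class FieldNameDemo
{
    static void Main()
    {
        string prefix = "tasks";   // the value passed in via ViewData["prefix"]
        int id = 7;                // hypothetical Task.Id

        // The hidden "index" field tells the binder which indices were posted.
        Console.WriteLine(prefix + ".index");                // tasks.index

        // Each property field gets the "tasks[7]." prefix, as in the view code.
        string fieldPrefix = string.Format("{0}[{1}].", prefix, id);
        Console.WriteLine(fieldPrefix + "Id");               // tasks[7].Id
        Console.WriteLine(fieldPrefix + "TaskDescription");  // tasks[7].TaskDescription
        Console.WriteLine(fieldPrefix + "IsCompleted");      // tasks[7].IsCompleted
    }
}
```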

The Controller Get and Post methods are as follows:

    public ActionResult BulkEdit()
    {
        var tasks = from t in db.Tasks orderby t.EntryDate descending select t;

        return View(tasks.ToList());
    }

    [AcceptVerbs(HttpVerbs.Post)]
    public ActionResult BulkEdit(IList<Task> tasks)
    {
        foreach (Task task in tasks)
        {
            foreach (Task dbTask in db.Tasks)
            {
                if (dbTask.Id == task.Id)
                {
                    dbTask.TaskDescription = task.TaskDescription;
                    dbTask.EntryDate = task.EntryDate;
                    dbTask.IsCompleted = task.IsCompleted;
                }
            }
        }

        db.SubmitChanges(); // persist the tracked changes

        return RedirectToAction("Index");
    }

My question is, this seems too complicated and I haven't yet accounted for tasks being added or deleted from the list. What I would prefer to do is something like

db.Tasks = tasks;

and let Linq do all its magic to work out which ones have changed and which ones are new / old.
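In the meantime, a slightly less nested version of my update loop (still only handling updates, just a sketch) keys the existing rows by Id so db.Tasks isn't rescanned for every posted item:

```csharp
[AcceptVerbs(HttpVerbs.Post)]
public ActionResult BulkEdit(IList<Task> tasks)
{
    // Index existing rows by Id instead of scanning db.Tasks per posted item.
    var dbTasks = db.Tasks.ToDictionary(t => t.Id);

    foreach (Task task in tasks)
    {
        Task dbTask;
        if (dbTasks.TryGetValue(task.Id, out dbTask))
        {
            dbTask.TaskDescription = task.TaskDescription;
            dbTask.EntryDate = task.EntryDate;
            dbTask.IsCompleted = task.IsCompleted;
        }
    }

    db.SubmitChanges(); // persist the tracked changes
    return RedirectToAction("Index");
}
```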

Is this possible? Or am I expecting a bit too much from Linq so soon?

By : Robin Day


You're dealing with the fact that LINQ to SQL has no multi-tier story. I think that this is what you're looking for:

The Merge method that this guy presents could easily be turned into an extension method on the DataContext class, and then it would almost be as if it were built right into LINQ to SQL. I say almost because you'd have to import the namespace where your extension method lives in order to use it.
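The linked Merge method isn't reproduced here, but to illustrate the idea (all names and behaviour below are my own assumptions, not the linked author's code), such an extension method might look roughly like this:

```csharp
using System;
using System.Collections.Generic;
using System.Data.Linq;

public static class DataContextExtensions
{
    // Hypothetical sketch: sync a detached list into a LINQ to SQL table,
    // updating matches, deleting rows not posted back, inserting new items.
    public static void Merge<T>(this DataContext db, Table<T> table,
                                IEnumerable<T> incoming,
                                Func<T, T, bool> sameKey,
                                Action<T, T> copyValues) where T : class
    {
        var existing = new List<T>(table);
        var incomingList = new List<T>(incoming);

        foreach (T row in existing)
        {
            T match = incomingList.Find(i => sameKey(row, i));
            if (match != null)
                copyValues(row, match);      // update the tracked row in place
            else
                table.DeleteOnSubmit(row);   // not posted back -> delete
        }

        foreach (T item in incomingList)
        {
            if (!existing.Exists(row => sameKey(row, item)))
                table.InsertOnSubmit(item);  // new item -> insert
        }

        db.SubmitChanges();
    }
}
```

With something like that in scope (remembering to import its namespace, as noted above), the controller action could collapse to a single call such as `db.Merge(db.Tasks, tasks, (a, b) => a.Id == b.Id, copyTaskValues)`.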

I think you should consider using a view to accomplish what you want: use an ObservableCollection to hold the items, and then bind your list to a view of that collection. You would still need to handle adding and deleting of items, but you can also opt for an IBindingList, which will give you a BindingListCollectionView that updates the DB; all you need to do is call the view's AddNew(), CommitNew() and so on.
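As a rough sketch of the ObservableCollection idea (this assumes a rich-client/WPF-style setup, which isn't what the question's ASP.NET code uses; the DataContext name is hypothetical), collection changes can be mapped onto LINQ to SQL insert/delete calls:

```csharp
using System.Collections.ObjectModel;
using System.Collections.Specialized;
using System.Linq;

class TaskSync
{
    public ObservableCollection<Task> Tasks { get; private set; }
    private readonly TaskListDataContext db; // hypothetical DataContext name

    public TaskSync(TaskListDataContext db)
    {
        this.db = db;
        Tasks = new ObservableCollection<Task>(db.Tasks.ToList());
        Tasks.CollectionChanged += OnChanged;
    }

    private void OnChanged(object sender, NotifyCollectionChangedEventArgs e)
    {
        if (e.NewItems != null)
            foreach (Task t in e.NewItems) db.Tasks.InsertOnSubmit(t);
        if (e.OldItems != null)
            foreach (Task t in e.OldItems) db.Tasks.DeleteOnSubmit(t);
        // Edits to existing, attached items are tracked by LINQ to SQL itself.
    }

    public void Save() { db.SubmitChanges(); }
}
```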

HTH, Eric

By : user61477

Unlike autogenerated code from an ORM product, stored procs can be performance-tuned, which is critical in a large production environment. There are many ways to tweak performance that are not available when using an ORM. There are also many, many tasks performed by a large database which have nothing to do with the user interface and thus should not be run from code generated there.

Stored procs are also required if you want to control rights so that users can only perform the operations specified in the procs and nothing else. Otherwise, users can much more easily make unauthorized changes to the database and commit fraud. This is one reason why database people who work with large, business-critical systems do not allow any access except through stored procs.

If you are moving large amounts of data to other servers, though, I would consider using DTS (if using SQL Server 2000) or SSIS. This may speed up your processes still further, but it will depend greatly on what you are doing and how.

The fact that stored procs may be faster in this case doesn't rule out that indexing may be wrong or statistics out of date, but generally DBAs who manage large sets of data tend to be pretty on top of this stuff.

It is true the process you describe seems a bit convoluted, but without seeing the structure of what is happening and understanding the database and environment, I can't say if maybe this is the best process.

I can tell you that new employees who come in and want to change working systems to fit their own personal prejudices tend to be taken less than seriously, and then you will have little credibility when you do need to suggest a valid change. This is particularly true when your past experience is not with databases of the same size or type of processing. If you were an expert in large systems, you might be taken more seriously from the start, but face it, you are not, and thus your opinion is not likely to sway anybody until you have been there a while and they have a measure of your real capabilities. Plus, if you learn the system as it is and work with it as it is, you will be in a better position in six months or so to suggest improvements rather than changes.

