pcHDTV Forum Index -> mythTV wiith the HD-2000 card -> Problems playing 720p recording on 1080i display using xvmc
Problems playing 720p recording on 1080i display using xvmc
PostPosted: Sat Jan 28, 2006 10:48 am Reply with quote
marc.aronson
 
Joined: 28 Jan 2006
Posts: 5




Hi:

I'm having a problem that I am hoping someone can help me with. First, my setup:

knoppmyth R5A30.2 => mythtv 0.18.2
P4-2.4ghz, 512MB ram
FX5200 card, nvidia driver version 6629
Display device: Samsung DLP HLR5067w (720p native)
Connection via VGA

When I play a 1080i recording with XVMC enabled the video quality is perfect and CPU utilization is 60%.

When I play a 720p recording with XVMC enabled, the video jumps around a lot. Very jerky. If I turn off XVMC and play through libmpeg2, the playback is perfect but CPU utilization is 90%.

My goal is to use XVMC for both 720p and 1080i playback.

I believe the problem is that the XVMC processing path is not properly dealing with the display of 720p material on a 1080i display device. Does anyone have any ideas or thoughts of how I could address this?

BTW, I've tried nvidia driver versions 7174, 8174 and 8178, but all shared two show-stopping problems: they produced a blank screen when used with my 1080i modeline, and when used with my 720p modeline, my CPU would max out and video playback was jerky. It seems like the 6629 version is the only one that "works" for me...
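For reference, the two modelines in question follow the standard CEA timings for these modes; the values below are the textbook ones, not necessarily what's in my xorg.conf:

```
Section "Monitor"
    # 1280x720p at 60 Hz (74.25 MHz pixel clock) -- standard CEA timing, illustrative only
    ModeLine "1280x720"   74.25  1280 1390 1430 1650   720  725  730  750 +hsync +vsync
    # 1920x1080i at 60 Hz (interlaced) -- standard CEA timing, illustrative only
    ModeLine "1920x1080i" 74.25  1920 2008 2052 2200  1080 1084 1094 1125 Interlace +hsync +vsync
EndSection
```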

Thanks for any help you can provide!


Marc


Last edited by marc.aronson on Sat Mar 11, 2006 9:39 am; edited 1 time in total
PostPosted: Sun Jan 29, 2006 3:55 am Reply with quote
kmj0577
 
Joined: 03 Jan 2006
Posts: 57




Do you mean jerky as in dropped frames? Or jerky as in it's being bob deinterlaced (assuming you have that enabled on the XvMC setup page) and jumping up and down because of that?
PostPosted: Sun Jan 29, 2006 9:27 am Reply with quote
marc.aronson
 
Joined: 28 Jan 2006
Posts: 5




Thanks for replying. To answer your questions:

1. I don't have any de-interlacing turned on. My assumption is that I should not be de-interlacing because I am running my display as an interlaced display (1080i). 1080i interlaced material plays fine without deinterlacing, as expected. The problem only occurs with 720p (progressive scan) material.

2. It does not look like dropped frames. The "jumping up and down" description is a good one for what I am seeing.

You mentioned a deinterlace option on the XVMC setup page. I don't see an XVMC setup page. The deinterlace option I am looking at is on the first screen of "setup->TV Settings->Playback". This screen is labeled "General Playback". Is there another screen or configuration file I should be looking at for XVMC setup options?

marc
DVI?
PostPosted: Sun Jan 29, 2006 3:39 pm Reply with quote
mkinn
 
Joined: 17 Jun 2005
Posts: 5
Location: Birmingham, Alabama




Does your 5200 have DVI output?
You should get a card with DVI out. You would see a large increase in picture quality, as you would avoid the digital-to-analog conversion (to VGA) and the analog-to-digital conversion inside the Samsung. This is true for ALL digital displays. You should ALWAYS output the NATIVE scan rate of the display, in this case, 720p.

Good luck with the driver, but you may do better with another display card with DVI and different drivers.

_________________
Michael A Kinnaird
Senior ISF Calibrator - all brands
HTPC: ATi HD Wonder/All In Wonder 9600XT
PostPosted: Sun Jan 29, 2006 5:36 pm Reply with quote
marc.aronson
 
Joined: 28 Jan 2006
Posts: 5




Mkinn, from everything I have read the Samsung I am using is capable of 1080i. Since many of my channels are transmitted in 1080i format, I believe I am better off displaying them at 1080i. I've tried the experiment:

1. When I display 1080i material using my 720p modeline, and de-interlace, I can see (minor) artifacts from the de-interlacing.

2. When I display 1080i material using my 1080i modeline, no de-interlace, the results are excellent. No artifacts.

3. When I display 720p material using my 720p modeline, no de-interlace, the results are excellent.

The only problem is displaying 720P material using my 1080i modeline with XVMC enabled.

In terms of VGA vs. DVI -- the DVI connection does not support 1080i. VGA supports both 720p and 1080i. I've seen several posts from people who have moved from DVI to VGA when using the Samsung so they can access both modes...

Marc
PostPosted: Sun Jan 29, 2006 6:53 pm Reply with quote
Scott Larson
 
Joined: 15 Oct 2003
Posts: 713
Location: Portland, OR




Have you tried playing these files with an XvMC-enabled version of xine or mplayer? This would narrow down where the problem lies.
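For example, with XvMC-enabled builds of each, something along these lines should exercise the same decode path (the file name is just a placeholder):

```shell
# MPlayer: the XvMC output driver needs the matching XvMC-accelerated MPEG-2 decoder
mplayer -vo xvmc -vc ffmpeg12mc recording.mpg

# xine: select the xvmc video output driver
xine -V xvmc recording.mpg
```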

It sounds like your interlaced display is displaying each progressive frame as a field.
PostPosted: Sun Jan 29, 2006 9:42 pm Reply with quote
mkinn
 
Joined: 17 Jun 2005
Posts: 5
Location: Birmingham, Alabama




Your Samsung, and ALL DLP RPTVs, are 1280x720, except the new Mitsubishi WD-52627 and WD-73727. Therefore, you should send it 720p through whatever connection you use. This eliminates a scaling step in the video processing circuitry. If you send it DVI, that eliminates 2 more steps, resulting in the very best picture your display can produce.
The only exception would be a set-top-box (Dish or DirecTV). They should be set to NATIVE, because your TV will do a better job of converting 1080i to 720P than the cheap electronics in those boxes.
Your computer will do even better than your TV, on this conversion. Proven fact, no debate needed. The chips on your video card, along with the algorithms needed to do the scaling are far better in the computer than the ones in the TV. The only processor that can do a better job would be a Faroudja or possibly a Lumagen HD PRO ($1,500.00), and only because of the MPEG (Mosquito) noise reduction and de-interlacing algorithms in those processors. They can clean up some pretty bad video.

_________________
Michael A Kinnaird
Senior ISF Calibrator - all brands
HTPC: ATi HD Wonder/All In Wonder 9600XT
PostPosted: Sun Jan 29, 2006 11:46 pm Reply with quote
marc.aronson
 
Joined: 28 Jan 2006
Posts: 5




mkinn, thank you for the explanation. I accept your point about the advantages of sending output at 720p instead of 1080i to my TV set. Having said this, I am still facing a problem. Let me explain.

My PC is a 2.4ghz P4. What I am finding is that without XVMC enabled, my system is slightly underpowered. When I turn on XVMC, here is what happens:

1. source=720p; modeline=720p: Video looks great!

2. source=1080i; modeline=1080i: Video looks great!

3. source=720p; modeline=1080i: Video is very jumpy and unwatchable. My guess is that the processing path in the PC is not converting properly.

4. source=1080i; modeline=720p: The video is watchable but looks like it has not been de-interlaced. I've tried using all the de-interlace options, including "bob".

Up until now I've been trying to find a solution for scenario #3, as I've been using my 1080i modeline. Given your explanation, it's clear I should be using my 720p modeline and trying to fix scenario #4. Any suggestions on how to address this? I am using Nvidia driver version 6629, as I've found that the later versions use a lot more CPU.

Thanks!

Marc
Re: Problems playing 720p recording on 1080i display using x
PostPosted: Sun Jan 29, 2006 11:55 pm Reply with quote
Scott Larson
 
Joined: 15 Oct 2003
Posts: 713
Location: Portland, OR




mkinn wrote:
Your computer will do even better than your TV, on this conversion. Proven fact, no debate needed. The chips on your video card, along with the algorithms needed to do the scaling are far better in the computer than the ones in the TV.

The only hardware deinterlacing the FX5200 card supports is bob deinterlacing, the cheapest and simplest method available.
PostPosted: Mon Jan 30, 2006 12:01 am Reply with quote
mkinn
 
Joined: 17 Jun 2005
Posts: 5
Location: Birmingham, Alabama




I hate to sound obvious, but you don't need to de-interlace 720P, and I think the scaling should be automatic, with no adjustments needed. Do ANY of the driver versions work at all?
Also, PBS and some other sources are 59.94 Hz vertical refresh, not 60 Hz. I think this will result in a dropped frame and a brief stutter every 16 seconds or so.
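To put a number on that: the source and the display drift apart by 60 - 59.94 = 0.06 frames per second, so one frame of slippage builds up about every 16.7 seconds:

```shell
# One frame of drift accumulates every 1/(60 - 59.94) seconds
awk 'BEGIN { printf "%.1f seconds between stutters\n", 1 / (60 - 59.94) }'
```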

_________________
Michael A Kinnaird
Senior ISF Calibrator - all brands
HTPC: ATi HD Wonder/All In Wonder 9600XT
PostPosted: Mon Jan 30, 2006 12:15 am Reply with quote
marc.aronson
 
Joined: 28 Jan 2006
Posts: 5




mkinn, you are correct that 720p material does not need to be de-interlaced. The issue is when I am playing 1080i material. Then it needs to be deinterlaced. Under these circumstances, Scott's observation about bob-deinterlacing is critical. It sounds like scenario #4 may not be worth pursuing (getting 1080i recordings to deinterlace and play well in 720p display mode with xvmc enabled).

Versions 7174, 8174 & 8178 drive CPU usage too high for me to comment on whether or not they work in general. They won't work for me with a 2.4ghz P4.

Sounds like there are 3 possible approaches to pursue:

1. Find a way to solve the "jerky video" problem when playing 720p material with a 1080i modeline.

2. Find a way to auto-switch between using a 1080i modeline and a 720p modeline, based on the resolution of the recording.

3. Is there an Nvidia card that supports a better de-interlacing choice than "bob" when xvmc is enabled? I am still within my 30-day return period on the FX5200, and I could easily trade it for something better.
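For #2, I could imagine scripting the switch outside of mythtv. A rough sketch; the modeline names and the probing commands in the comments are just assumptions about my setup, not tested:

```shell
# Sketch of approach #2: choose a modeline from the recording's vertical resolution.
# Modeline names here are hypothetical and must match what's in xorg.conf.
pick_mode() {
    case "$1" in
        1080) echo "1920x1080i" ;;       # interlaced sources stay interlaced
        720)  echo "1280x720"   ;;       # progressive sources at native rate
        *)    echo "1280x720"   ;;       # fall back to the panel's native mode
    esac
}

# Possible usage (not run here):
#   height=$(mplayer -identify -frames 0 "$rec" 2>/dev/null | sed -n 's/^ID_VIDEO_HEIGHT=//p')
#   xrandr -s "$(pick_mode "$height")"
```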

I remain open to any ideas -- thanks for your time!

Marc
PostPosted: Tue Jan 31, 2006 10:34 am Reply with quote
kmj0577
 
Joined: 03 Jan 2006
Posts: 57




marc.aronson wrote:
3. Is there an Nvidia card that supports a better de-interlacing choice than "bob" when xvmc is enabled? I am still within my 30-day return period on the FX5200, and I could easily trade it for something better.

The 7800GTX has spatial deinterlacing tweaked for 1080i.

Looks like the 6600 has that too, but the 7800 is supposed to be smoother.
PostPosted: Tue Jan 31, 2006 11:29 am Reply with quote
Scott Larson
 
Joined: 15 Oct 2003
Posts: 713
Location: Portland, OR




Is the new spatial deinterlacing supported in the XvMC API?
PostPosted: Tue Jan 31, 2006 11:38 am Reply with quote
kmj0577
 
Joined: 03 Jan 2006
Posts: 57




Scott Larson wrote:
Is the new spatial deinterlacing supported in the XvMC API?

That I'm not sure of. I don't think it is as of yet.

Can't say for sure since I've only got a 5200 as well.
PostPosted: Tue Jan 31, 2006 12:14 pm Reply with quote
Scott Larson
 
Joined: 15 Oct 2003
Posts: 713
Location: Portland, OR




The XvMC API doesn't really specify a deinterlacing technique. When you tell it to display top fields and bottom fields from a frame, it's up to the hardware to decide how to deinterlace (and apparently it's not even required). Maybe the 7800GTX will do spatial deinterlacing by default.

But with Nvidia's Linux drivers, I wouldn't count on it.
Powered by phpBB © 2001-2003 phpBB Group
Theme created by Vjacheslav Trushkin